You can't do interpolation live, it's literally impossible: to interpolate, you have to blend the current frame *and the next one*, which means the system has to wait for the next frame before it can do anything. That's an absolute minimum latency of one frame, roughly 17ms at 60fps or 33ms at 30fps. On top of that comes the time it takes the system to combine the two images into the interpolated frame, which depends on how fast the system is and how the frame is rendered. The fastest method is to just smush the two frames together half and half, buuuut that looks terrible and no one wants that! A proper interpolation system analyzes the differences between the two frames and renders an image that's a fair approximation of what the in-between frame would have been if the source had actually rendered it. As you can imagine, that takes a lot of processing power, and that means a lot of time: over a hundred milliseconds in most cases! Some systems go up to 150ms, and at 60fps that's about nine frames behind. That's laggier than playing Brawl on WFC was!
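To put rough numbers on this, here's a small sketch of the latency math above, plus the naive "smush the two frames together half and half" blend (the frame rates, processing time, and function names are illustrative assumptions, not any real interpolator's API):

```python
def min_interp_latency_ms(fps: float, processing_ms: float = 0.0) -> float:
    """Interpolation must wait for the *next* frame, so the floor is one
    full frame time; real systems add processing time on top of that."""
    return 1000.0 / fps + processing_ms

# One-frame floor: ~16.7 ms at 60 fps, ~33.3 ms at 30 fps.
print(round(min_interp_latency_ms(60), 1))   # 16.7
print(round(min_interp_latency_ms(30), 1))   # 33.3

# A 150 ms pipeline at 60 fps lags by about nine frames.
print(round(150 / (1000.0 / 60), 1))         # 9.0

def naive_blend(frame_a, frame_b):
    """50/50 average of two frames (flat lists of pixel values) --
    the fast-but-ugly method, not true motion interpolation."""
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

# Averaging mid-motion pixels produces the ghosting people complain about.
print(naive_blend([0, 100, 255], [255, 100, 0]))  # [127.5, 100.0, 127.5]
```

A real motion-compensated interpolator would estimate per-block motion vectors between the two frames instead of averaging pixels in place, which is exactly where the extra processing time goes.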
Anyway, interpolation always adds latency, and it works best on non-interactive systems, like watching a video. On interactive content the latency is really bad!
AMD Threadripper Pro 5975WX PBO+200 | Asrock WRX80 Creator | NVIDIA GeForce RTX 4090 FE | 64GB DDR4-3600 Octo-Channel | Windows 11 23H1 | (details)
MacBook Pro 14in | M1 Max (32 GPU Cores) | 64GB LPDDR5 6400 | macOS 12