Dolphin, the GameCube and Wii emulator - Forums

Full Version: [SUGGESTION] Increase Framerate Limit with Interpolation
You can't do interpolation live, it's literally impossible - you have to create a blend of the current frame *and the next one* to interpolate! That means an absolute minimum latency of one frame, or about 16ms at 60fps or 33ms at 30fps. Then you have the time it takes the system to combine the images to make the interpolation frame, which depends on how fast the system is and how the interpolation frame is rendered. The fastest method is to just smush the two frames together half and half, buuuut that's terrible and no one wants that! A proper interpolation system will analyze the differences between the two frames and render an interpolated image that is a fair approximation of what the in-between frame would have been if the source had actually rendered it. As you can imagine, that takes a lot of processing power, and that means a lot of time - over a hundred milliseconds in most cases! Some of them go up to 150ms; at 60fps, that's almost ten frames behind! That's laggier than playing Brawl on WFC was!

Anyway, interpolation always adds latency, and it works best on non-interactive systems, like watching a video. On interactive content the latency is really bad!
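
To make those numbers concrete, here is a minimal sketch of the cheapest "smush them together half and half" blend and the unavoidable one-frame wait. The function names and byte-buffer layout are assumptions for illustration only, not Dolphin code.

Code:
#include <cstddef>
#include <cstdint>
#include <vector>

// The cheapest possible interpolation: a 50/50 blend of two frames.
// frameA is the current frame, frameB is the *next* frame, which is why
// at least one frame of latency is unavoidable - the blend can't be
// produced until frameB exists.
std::vector<std::uint8_t> BlendFrames(const std::vector<std::uint8_t>& frameA,
                                      const std::vector<std::uint8_t>& frameB)
{
    std::vector<std::uint8_t> out(frameA.size());
    for (std::size_t i = 0; i < frameA.size(); ++i)
        out[i] = static_cast<std::uint8_t>((frameA[i] + frameB[i]) / 2);
    return out;
}

// Minimum added latency from waiting for the next frame:
// 1000 / 60 ~= 16.7 ms at 60fps, 1000 / 30 ~= 33.3 ms at 30fps.
double MinimumInterpolationLatencyMs(double fps)
{
    return 1000.0 / fps;
}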
(06-29-2016, 11:43 AM)MaJoR Wrote: [ -> ]You can't do interpolation live, it's literally impossible - you have to create a blend of the current frame *and the next one* to interpolate! [...]

Oh okay, and yes, I know - I'm using it for watching video and anime.

Qaazavaca Qaanic

(06-29-2016, 11:43 AM)MaJoR Wrote: [ -> ]You can't do interpolation live, it's literally impossible - you have to create a blend of the current frame *and the next one* to interpolate! [...]
If you delay every 30fps frame by 1/60th of a second, and each time a 30fps frame is rendered you blend it with the previous one, does that count as 1/30th of a second of delay or 1/60th?
(06-29-2016, 11:43 AM)MaJoR Wrote: [ -> ]As you can imagine, that takes a lot of processing power, and that means a lot of time - over a hundred milliseconds in most cases! Some of them go up to 150ms, at 60fps, that's almost ten frames behind! 
This is not because of processing power. Delaying the output won't decrease the processing requirements. Maybe a bit because of lazy programming, but in general, this latency is required to interpolate between many frames.
Imagine you have some movement on screen that is accelerating. If you take 2 frames of this movement and just generate the average (the midpoint) of them, the movement will not be smooth. It will accelerate for one frame, keep the same speed for one, accelerate again for one frame, and so on. This is why you need to analyse more than just 2 frames, which means more processing power required and more latency.
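
A quick way to see that stepping with actual numbers: the standalone sketch below (illustrative only, with a made-up acceleration value) midpoint-interpolates a uniformly accelerating position to double the framerate and prints how far the object moves on each displayed frame.

Code:
#include <cstdio>

int main()
{
    // Object accelerating uniformly: position = t^2, so the source positions
    // at frames 0..4 are 0, 1, 4, 9, 16 (velocity grows by 2 every frame).
    const double source[5] = {0.0, 1.0, 4.0, 9.0, 16.0};

    // Interpolate to twice the rate using only the two neighbouring source
    // frames (plain midpoint), and print the size of each displayed step.
    double prev = source[0];
    for (int i = 1; i < 5; ++i)
    {
        double mid = 0.5 * (source[i - 1] + source[i]);
        std::printf("step to midpoint: %+.1f, step to frame %d: %+.1f\n",
                    mid - prev, i, source[i] - mid);
        prev = source[i];
    }
    // Output shows pairs of equal steps: (0.5, 0.5), (1.5, 1.5), (2.5, 2.5)...
    // The displayed velocity stays constant for two output frames, then jumps,
    // instead of increasing smoothly - the judder two-frame interpolation gives
    // on accelerating motion. Estimating the acceleration needs more source
    // frames, hence more buffering and more latency.
    return 0;
}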
(06-29-2016, 03:39 PM)degasus Wrote: [ -> ]This is not because of processing power. Delaying the output won't decrease the processing requirements. Maybe a bit because of lazy programming, but in general, this latency is required to interpolate between many frames.

Is it possible to do this directly on the GPU?
The only useful thing that could come of this speculation is some sorta motion blur.

1) During frames 1 and 2, gather buffer data
2) On frame 3, render 1/4 of the scene in a sparse sample pattern at 1/4 resolution
3) On frame 3, add the sample data from step 1 and smear along the motion vector

Motion blur does sometimes help with 30FPS render targets, but it's unpopular enough that it's doubtful anyone would add it.
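
For what it's worth, here is a rough sketch of what the "smear along the motion vector" step could look like on the CPU side. The per-pixel motion-vector buffer, the data layout, and the sample count are purely hypothetical assumptions for illustration, not anything Dolphin actually provides.

Code:
#include <cstddef>
#include <vector>

struct Color { float r, g, b; };
struct Vec2  { float x, y; };

// Hypothetical per-pixel motion blur: for each pixel, average several samples
// taken backwards along that pixel's screen-space motion vector (the vector
// from the pixel's position in the previous frame to its position now).
void MotionBlurSmear(const std::vector<Color>& frame,
                     const std::vector<Vec2>& motionVectors,
                     std::vector<Color>& out, int width, int height,
                     int numSamples = 4)
{
    out.resize(frame.size());
    for (int y = 0; y < height; ++y)
    {
        for (int x = 0; x < width; ++x)
        {
            const std::size_t idx = static_cast<std::size_t>(y) * width + x;
            const Vec2 mv = motionVectors[idx];
            Color accum{0.0f, 0.0f, 0.0f};
            for (int s = 0; s < numSamples; ++s)
            {
                // Step backwards along the motion vector.
                float t = static_cast<float>(s) / numSamples;
                int sx = x - static_cast<int>(mv.x * t);
                int sy = y - static_cast<int>(mv.y * t);
                // Clamp to the frame edges.
                sx = sx < 0 ? 0 : (sx >= width ? width - 1 : sx);
                sy = sy < 0 ? 0 : (sy >= height ? height - 1 : sy);
                const Color& c = frame[static_cast<std::size_t>(sy) * width + sx];
                accum.r += c.r; accum.g += c.g; accum.b += c.b;
            }
            out[idx] = {accum.r / numSamples, accum.g / numSamples,
                        accum.b / numSamples};
        }
    }
}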