This is totally speculative. I doubt that the image would be perfect even on a high DPI display, especially in motion.
Plus, in many cases, good anti-aliasing is even more important for IQ than a high native resolution, as the brain can detect even slight shimmering.
But that's not even my point; the point is accuracy (!).
There are almost always side effects of this that people do not consider at first.
Let's just assume 9xSSAA + a high-DPI display is absolutely perfect (!) in terms of edge resolution, texture shimmering etc., and that texture sharpness in the distance cannot be improved any further.
(All of which I highly doubt!)
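For reference, "9xSSAA" just means nine samples averaged down into each output pixel. Here is a minimal sketch of that resolve step (my own toy Python/NumPy version, not any driver's actual implementation; resolve_9x is a name I made up):

import numpy as np

def resolve_9x(hires):
    # Box-filter resolve of ordered-grid 9xSSAA: average each 3x3
    # block of the high-resolution render into one output pixel.
    h, w = hires.shape[0] // 3, hires.shape[1] // 3
    return hires[:h * 3, :w * 3].reshape(h, 3, w, 3).mean(axis=(1, 3))

# Toy "render": a hard black/white diagonal edge, sampled at 3x resolution.
yy, xx = np.mgrid[0:30, 0:30]
hires = (xx > yy).astype(float)   # aliased: every sample is 0.0 or 1.0
lores = resolve_9x(hires)         # 10x10 resolved output
print(np.unique(lores))           # 0.0, ~0.33, 1.0 - fractional edge coverage

The averaging gives edge pixels intermediate values proportional to coverage, which is exactly the information a plain 1-sample-per-pixel render throws away.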
How can you exclude the possibility that there are rendering techniques or post-processing effects that take advantage of higher rendering accuracy?
An example:
How many people say the 24-bit color depth of their display is sufficient?
(There are many who even come up with bullshit like "the human eye cannot see the difference between more than 500 colors" and the like.)
Now think of the color dithering in SweetFX, quote: "Applies dithering to simulate more colors than your monitor can display."
So why is this done, if the colors are already that "perfect" and no one can see the difference?
The answer is: "This lessens banding artifacts".
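To make that concrete: 24-bit color is only 256 levels per channel, so a slow dark-to-light gradient across the screen collapses into wide flat bands the eye can pick out, and noise hides the steps. A minimal sketch of such a dither (my own toy Python/NumPy version, not SweetFX's actual shader; I quantize to 3 bits to exaggerate the effect):

import numpy as np

# Smooth horizontal gradient, one 8-bit channel, values 0..255.
gradient = np.linspace(0.0, 255.0, 1024)
step = 255.0 / 7.0  # quantization step for 3 bits (8 levels)

# Plain quantization: long flat runs -> visible banding.
banded = np.round(gradient / step) * step

# Dithered quantization: add up to half a step of noise before rounding.
noise = np.random.uniform(-0.5, 0.5, size=gradient.shape)
dithered = np.round(gradient / step + noise) * step

# The eye averages neighboring pixels; after a small box blur the
# dithered signal tracks the original much better than the banded one.
kernel = np.ones(16) / 16.0
blur = lambda x: np.convolve(x, kernel, mode="same")
print("avg error, banded:  ", np.abs(blur(banded) - gradient)[8:-8].mean())
print("avg error, dithered:", np.abs(blur(dithered) - gradient)[8:-8].mean())

The banded version can only ever hit 8 exact values, while the dithered one flickers between adjacent levels in the right proportions, so on average it represents shades the output format cannot actually store.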
I hope this example made it somewhat clear what I mean.
There are sometimes side effects of digitization accuracy that people do not consider at first glance.