Weird, because Avatar is definitely outputting at 60 frames per second: my Blu-ray player and TV both have the option to switch between 24 and 60 over HDMI, and Windows says the content is 60 fps. Not to mention my HD camcorder records at 60 fps. I could go on and on, but I know it won't make a difference, so I won't. My TV definitely handles 24 frames per second perfectly; DVDs look great. Hell, I just realized that whenever I burn an HD movie with TMPGEnc, the drop-down list specifically says (DVD, bitrate 8000, 720 x 480, 24 fps) (Blu-ray, bitrate 60000, 1920 x 1080, 60 fps). I believe you, but it just doesn't make sense that every piece of hardware, software, and content I have ever gotten that is related to HD or Blu-ray seems to be designed for 60 fps.
Well, everything else besides movies is usually rendered or shot at 60 fps (or, in the case of games, potentially even more), so it would make sense to think that.
I don't recall a single video game that would output at 24 frames per second.
For example, sports events shot in high definition obviously aren't capped at 24 fps, and in fact need a very high refresh rate to look smooth. Also, user-created amateur movies shot with handheld cams are almost never shot at 24p. It's just the standard for Hollywood Blu-ray films.
Here is a very useful link that explains the phenomenon.
http://forum.blu-ray.com/blu-ray-technology-news/102488-23-976p-24p-blu-ray.html
When you say you're getting 60 fps, that's not entirely incorrect. When you view 24p content at 60 Hz, the TV repeats one frame twice and the next frame three times (the 3:2 pulldown cadence) to approximate fluid motion. So in theory, you *are* viewing 60 frames per second (artificially).
Blu-ray 24p movies are best viewed at 120 Hz, because each source frame can simply be shown five times in a row (120 is evenly divisible by 24). Alternatively, via 2:2 pulldown with speedup, a TV set can for example speed 24p source up to 25 frames per second and display it at 50 Hz (or, as is very likely in your case, convert the 24p source via 3:2 pulldown, duplicating frames to reach 60 fps).
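If it helps, the pulldown arithmetic can be sketched in a few lines (a rough illustration only, not how any real TV firmware works; actual sets operate on interlaced fields and may add motion interpolation on top):

```python
def pulldown_cadence(source_fps, refresh_hz):
    """How many refresh cycles each source frame is held for.
    Plain integer division guarantees the repeats sum exactly
    to refresh_hz over one second of source frames."""
    return [(f + 1) * refresh_hz // source_fps - f * refresh_hz // source_fps
            for f in range(source_fps)]

print(pulldown_cadence(24, 60)[:6])   # [2, 3, 2, 3, 2, 3] -> uneven 3:2 cadence
print(pulldown_cadence(24, 120)[:6])  # [5, 5, 5, 5, 5, 5] -> even 5:5, no judder
```

The uneven 2-3-2-3 repeat pattern at 60 Hz is exactly the artificial duplication described above; at 120 Hz every frame is held equally long.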
Also, yes, the human eye can perceive frames at a much higher rate than 24 per second. Some sensitive people even notice the flicker of a 50 Hz screen. Humans have reportedly detected flicker at up to 220 frames per second in controlled retina-stimulation tests, beyond which it makes little difference. So you are correct in implying that some people (such as yourself) can notice the difference between 60 and 120 fps.
It is my honest belief that if everyone was presented with the same content at 60 fps and 120 fps side by side, they would notice the huge difference unless they had really awful vision.
Quote:When you say you're getting 60 fps, that's not entirely incorrect. When you view 24p content at 60 Hz, the TV repeats one frame twice and the next frame three times (the 3:2 pulldown cadence) to approximate fluid motion. So in theory, you *are* viewing 60 frames per second (artificially).
Blu-ray 24p movies are best viewed at 120 Hz, because each source frame can simply be shown five times in a row (120 is evenly divisible by 24). Alternatively, via 2:2 pulldown with speedup, a TV set can for example speed 24p source up to 25 frames per second and display it at 50 Hz (or, as is very likely in your case, convert the 24p source via 3:2 pulldown, duplicating frames to reach 60 fps).
I understand this, but it is odd that... ok, you know what, I'm going to give you an example. N64 games running in emulators like Project64 run at their original framerate; Ocarina of Time, for example, runs at 20 fps, but since I'm emulating it I can set the resolution to 1920 x 1080. Now, my TV allows me to set the refresh rate to 60 Hz or 24 Hz in the settings, or to configure it automatically based on the content. There is no difference whatsoever between 24 Hz and 60 Hz when I'm playing a video game at a low framerate like that, but with a Blu-ray movie there isn't just a difference, it's practically unwatchable at 24 Hz. So if this is correct and all commercial Blu-ray movies are 24 fps, then why does this happen? And why would high-resolution content not have terrible stuttering at such a low framerate? This defies the traditional logic of how our eyes work.
I mean, look at stereoscopic/colorscopic 3D for example. In the late 90s we started getting video cards with support for it, but then CRTs started getting replaced by LCDs with a much lower refresh rate. Stereoscopic/colorscopic died before it ever took off because all of the hardware vendors said that at 60 Hz stereoscopic was unwatchable due to the very noticeable stuttering, and colorscopic due to the very low maximum framerate. They said you needed at least 120 Hz, since the actual framerate was essentially being cut in half. So now that we have managed to get back up to 120 Hz with LCDs, suddenly all the vendors care again. But you're telling me that with high-resolution content 24 frames per second is unnoticeable?
Sorry to be so annoying about this, but I'm really curious, and you seem to know a lot more about it than I do. Also, that thread you linked did not explain why Blu-ray content is 24 fps at all. Although I think I know: if I'm not mistaken, films are still shot on film, and then the film is digitized. Since the standard format for film was 24 frames per second, we still have that standard. But since even old 15mm film has over 6000 vertical lines of resolution, even older movies can be remastered in high resolution.
(06-14-2010, 08:24 AM)NaturalViolence Wrote: I understand this, but it is odd that... ok, you know what, I'm going to give you an example. N64 games running in emulators like Project64 run at their original framerate; Ocarina of Time, for example, runs at 20 fps, but since I'm emulating it I can set the resolution to 1920 x 1080. Now, my TV allows me to set the refresh rate to 60 Hz or 24 Hz in the settings, or to configure it automatically based on the content. There is no difference whatsoever between 24 Hz and 60 Hz when I'm playing a video game at a low framerate like that, but with a Blu-ray movie there isn't just a difference, it's practically unwatchable at 24 Hz. So if this is correct and all commercial Blu-ray movies are 24 fps, then why does this happen? And why would high-resolution content not have terrible stuttering at such a low framerate? This defies the traditional logic of how our eyes work.
24p Blu-ray films, and any material shot at 24 fps, will generally look bad on any TV set or monitor that doesn't incorporate true 24p technology, which usually means the panel has to support 120 Hz or higher, such as a Sony Bravia. If the image is interpolated, or not mapped by a 5:5 (120 / 24) pulldown, it will look laggy and stutter.
My Samsung TV set, for example, is 60 Hz and can't support true 24p. Hence movies from Blu-ray sources look jerky and not very enjoyable at all when switching to the 24 Hz viewing mode. I'd record a quick demonstration clip, but my HDD is totally full. The only way to make films look tolerable is to artificially raise the number of displayed frames to 60 by switching to the 3:2 pulldown (24 * 2.5) 60 Hz mode.
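To put rough numbers on that jerkiness: with 3:2 pulldown, each 24p frame sits on screen for alternating lengths of time, whereas a 5:5 mapping at 120 Hz holds every frame equally long. A back-of-the-envelope sketch (idealized timings, ignoring panel response):

```python
def hold_times_ms(cadence, refresh_hz):
    """On-screen duration of each source frame, in milliseconds,
    given how many refresh cycles each frame is repeated for."""
    return [round(repeats * 1000 / refresh_hz, 1) for repeats in cadence]

print(hold_times_ms([3, 2, 3, 2], 60))   # [50.0, 33.3, 50.0, 33.3] -> alternating (judder)
print(hold_times_ms([5, 5, 5, 5], 120))  # [41.7, 41.7, 41.7, 41.7] -> uniform (smooth)
```

That alternation between 50 ms and 33 ms holds is what registers as judder on slow pans.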
It's common for Blu-ray films to look bad and slow when shown at 24 Hz on an older HDTV. The only way to make them look reasonable and normal is to raise the refresh rate from 24 Hz to 60 Hz, so that 2.5 times as many frames (150% more) are displayed each second (i.e. giving you 60 fps).
Quote:It's common for Blu-ray films to look bad and slow when shown at 24 Hz on an older HDTV. The only way to make them look reasonable and normal is to raise the refresh rate from 24 Hz to 60 Hz, so that 2.5 times as many frames (150% more) are displayed each second (i.e. giving you 60 fps)
But why does this only happen with Blu-ray movies? Why do video games seem to be immune to it? A video game rendered at 20 fps looks just as good at 24 Hz as it does at 60 or 120.
Also:
Quote:And why would high resolution content not have terrible stuttering at such a low framerate?
Wouldn't our eyes notice the extremely low framerate? I can sure as hell notice it in a video game.
Quote:But why does this only happen with Blu-ray movies? Why do video games seem to be immune to it? A video game rendered at 20 fps looks just as good at 24 Hz as it does at 60 or 120.
Quote:Wouldn't our eyes notice the extremely low framerate? I can sure as hell notice it in a video game.
Perception of motion, and vision itself, is an individual experience, largely based on how quickly your brain processes the information gained from the activation of the retinal cells.
Generally, anything slower than 16 frames per second will not look like a fluidly moving picture to most people, but rather like a series of discrete images. Of course, the more frames are shown, the smoother the motion will appear to the eye.
A video game can be shown at 25 fps and still appear to be in fluid motion because, unlike a movie, it lacks realistic lighting, detail and models. A modern game like Heavy Rain on the PS3 would be completely unplayable and unrealistic at rates below 30-50 fps, due to its realistic motion blur, level of detail, etc.
Old PC shooter games can often be played at even 300-400 fps because there is very little to render. The illusion of motion can be created in those games with less effort than in a game like God of War or Alan Wake.
So why aren't all new video games simply rendered at the highest possible speed? Because it's not technically feasible. It would require very powerful hardware to render newer console games at a constant 120 or more frames per second. Many games struggle even to keep up with the 60 fps target that publishers set for new releases, due to the limitations of the consoles. Even the best consumer GPUs money can buy in 2010 can't render a highly detailed game such as Crysis at maximum detail with constant 60 fps performance.
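The render budget shrinks quickly as the target framerate climbs, which is why a constant 120 fps was out of reach for 2010-era hardware. Simple arithmetic, nothing engine-specific:

```python
# Milliseconds the GPU/CPU have to finish each frame at a given target rate.
for fps in (24, 30, 60, 120):
    print(f"{fps:>3} fps -> {1000 / fps:.2f} ms per frame")
```

At 120 fps the whole frame, including physics, AI, and rendering, has to fit in roughly a third of the time available at 30 fps.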
Movies could be shot at higher speeds than 23.976 frames per second, but the vast majority of film directors, if not nearly all of them, have chosen this format. In a movie theatre, the film's image is projected onto the screen all at once. Unlike on a CRT computer screen or television, no horizontal scan lines are drawn; the entire image appears at once, which allows for smooth playback without flicker at roughly 24 fps.