I wasn't sure where to post this so I hope it's ok here.
I've got a couple of queries about FPS and VPS in terms of what the values mean.
Would I be right in thinking that if the VPS is showing 60 at 100% speed, then the emulator is running pretty much as a real machine would (assuming it's supposed to be outputting at a max framerate of 60fps)?
If that's the case then with a VPS value of 60 the FPS value should be pretty much what a real machine would be outputting at that particular moment... right? It could be less than 60 but that's not a problem if (a) the game is locked to, say, 30FPS or (b) the game is a variable frame rate game.
Assuming I've not got any of the above wrong, would I be right in thinking that the video backend *shouldn't* affect the internal FPS? I.e. if a particular section of a game runs at, say, 30FPS using OpenGL, then you'd expect it to also run at 30FPS using D3D, provided the VPS in both cases was showing 60 and the emulation speed was 100%.
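Just to lay out the relationship as I understand it (this is my own mental model, not anything from Dolphin's actual code, and the function names are made up for illustration):

```python
# My assumed model: VPS counts emulated vertical blanks per second,
# the speed % compares VPS against the console's native refresh rate,
# and FPS counts the unique frames the game actually renders.

NATIVE_REFRESH = 60  # NTSC GameCube/Wii output

def speed_percent(vps: float, native_refresh: float = NATIVE_REFRESH) -> float:
    """Emulation speed as a percentage of real-machine speed."""
    return 100.0 * vps / native_refresh

def expected_fps(vps: float, frames_per_vblank: float) -> float:
    """Internal FPS for a game that presents a new frame every
    vblank (1.0), every other vblank (0.5, i.e. 30FPS-locked), etc."""
    return vps * frames_per_vblank

# Full speed, 30FPS-locked game: 60 VPS -> 100% speed, 30 FPS internally
assert speed_percent(60) == 100.0
assert expected_fps(60, 0.5) == 30.0

# Half speed: 30 VPS -> 50%, which matches the 30/30/50% readout I saw
assert speed_percent(30) == 50.0
```

If that model is right, then the backend should only ever change how fast the vblanks tick over, never how many frames the game produces per vblank, which is why the D3D vs OpenGL difference surprised me.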
The reason for my confusion is that I've been playing Rogue Ops recently, and it's a variable framerate title. If I switch the frame limiter off I can run the game at over 200% speed, so I thought I'd try out the overclocking feature to see if I could get 60FPS throughout (I can). Before I did that, though, I tried a couple of test sections to see what I was getting without overclocking. I noticed I'd get a fairly consistent 30/60/100% (FPS/VPS/%) at the start of one level if I used OpenGL (my usual setting), but got 60/60/100% using D3D, and just spinning around and generally mucking about showed that it was definitely more fluid. If I turned frame dumping on, the D3D backend matched the OpenGL backend, i.e. it dropped to an effective rate of 30FPS.
Have I totally misunderstood the whole FPS/VPS situation, or is this behaviour as you'd expect? In terms of my settings, I'm pretty certain that other than using 4x internal resolution I hadn't altered anything else. I was using 4.0-5394, but I've just switched to 4.0-5718 and I'm getting the same result.
EDIT: OK, I assume it's something to do with the dual-core option being selected. If I disable it I get 30/30/50% using OpenGL (i.e. it effectively matches D3D's 60FPS... right??). I'm still a bit confused about what the "proper" framerate should be, but I guess it's whatever the frame dumper produces... right? Sorry, just a bit confused in general!