(02-28-2014, 09:04 PM)tueidj Wrote: I can make a sample program to prove that it's definitely not deterministic, if you like. But applying simple common sense really should make it obvious: if you've got two copies of the same program being fed the same inputs but running at different speeds, they are in no way guaranteed to act the same. Otherwise, go read the comment at the beginning of SystemTimers.cpp, which explicitly states: "These update frequencies are determined by the passage of frames. So if a game runs slow, on a slow computer for example, these updates will occur less frequently." Hence a slow computer will have a coarser-granularity timebase than a fast computer, even though the actual values may appear to stay in sync.

They occur less frequently in real time, not in emulated time. Everything happens less frequently when the emulator runs slower, obviously. If it worked the other way, you would introduce exactly the problem you are describing. If you feed the same inputs to a game, it WILL act the same every single time. There hasn't been a single non-savestate-related desync in movies for as long as I've been around.
Nintendo WFC will shut down in May 2014
