Your article shows that the difference between the FX 8350 and the i5 2500K is unnoticeable, and in most of the tests the difference compared to the i7 was also almost unnoticeable. Not to mention these tests were run in SLI/Crossfire at 1440p, and I'll be playing at 1080p on one card (where the FPS across CPUs was almost the same in those results). I think these tests are proving my point?
This is a quote from the link you posted
"Moving to dual GTX 580s, and while the split gets bigger, processors like the i3-3225 are starting to lag behind. The difference between the best AMD and best Intel processor is only 2 FPS though, nothing to write home about.
DiRT 3 conclusion
Much like Metro 2033, DiRT 3 has a GPU barrier and until you hit that mark, the choice of CPU makes no real difference at all. "
This link was supporting my argument, I was under the assumption you didn't agree with my argument....
This link covers FPS and frame times in Crysis 3 at 1080p, and the FX performed almost identically to the i7...
http://techreport.com/review/24879/intel-core-i7-4770k-and-4950hq-haswell-processors-reviewed/9
"Most of the time cpu load is very low and the bottleneck is on the gpu side." Which means the cpu isn't really all that important.
I concede, it seems as though Intel CPUs are in fact faster, but most of the time unnoticeably faster when used in real-world conditions.
For some reason he's linked to an article with framerate data, and not frametime data. As he's explained, if framerate data shows a small difference, frametime data will show a much larger difference.
If you try to say that you won't notice if the frametime is high, then you're probably wrong. The stuttering is noticeable, but you won't necessarily be able to tell what's happening is stuttering, just that the game twitches sometimes or goes through phases of just being slightly less responsive.
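To put rough numbers on why average framerate can hide what frame-time data shows, here's a minimal sketch with purely illustrative figures (not taken from any of the linked reviews):

```python
# Hypothetical frame times (ms) over one second for two CPUs.
# Both average close to 60 fps, but the second one spikes to 50 ms a few times.
cpu_steady = [16.7] * 60
cpu_spiky  = [14.0] * 55 + [50.0] * 5

def avg_fps(frametimes_ms):
    return 1000 * len(frametimes_ms) / sum(frametimes_ms)

def percentile_99(frametimes_ms):
    ordered = sorted(frametimes_ms)
    return ordered[int(len(ordered) * 0.99) - 1]

for name, ft in (("steady", cpu_steady), ("spiky", cpu_spiky)):
    print(f"{name}: {avg_fps(ft):.1f} fps average, "
          f"99th-percentile frame time {percentile_99(ft):.1f} ms")
```

The averages come out within a frame or two of each other, but the 99th-percentile frame time (the kind of number TechReport plots) is about three times worse for the spiky case, and those 50 ms frames are the twitches you feel.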
(06-30-2013, 10:54 PM)AnyOldName3 Wrote: For some reason he's linked to an article with framerate data, and not frametime data. As he's explained, if framerate data shows a small difference, frametime data will show a much larger difference.
If you try to say that you won't notice if the frametime is high, then you're probably wrong. The stuttering is noticeable, but you won't necessarily be able to tell what's happening is stuttering, just that the game twitches sometimes or goes through phases of just being slightly less responsive.
I posted a link in my previous post showing frame time comparing the i7s and the FX 8350, and they were almost identical.
omarman Wrote:I posted a link in my previous post showing frame time comparing the i7s and the FX 8350, and they were almost identical.
You call that nearly identical? That's a pretty substantial difference in most of those games.
omarman Wrote:This link was supporting my argument, I was under the assumption you didn't agree with my argument....
Neither of us originally argued whether it was "good enough". I said that Ivy Bridge/Haswell was the best choice for PC gaming and emulation and you disagreed on the PC gaming part. "Slightly better" or "much better" are both better, so it doesn't really matter how much better it is.
To make this clearer:
Argument 1: Current Intel i5 CPUs are better than the FX 8350 for PC gaming because they perform better with PC games.
Evidenced by: Frametime and framerate benchmarks from various sources (tomshardware, anandtech, xbitlabs, techreport, etc.)
Argument 2: Current Intel i5 CPUs are better than the FX 8350 for emulation because they perform better with emulators.
Evidenced by: PCSX2 and Dolphin benchmarks conducted by their respective community forums.
This explains why people recommend them on gaming and emulation forums. From what I gather you posted the link to that YouTube video because you disagreed with argument 1. After I refuted it you no longer disagree with either argument. Am I correct?
(07-01-2013, 05:42 AM)NaturalViolence Wrote: You call that nearly identical? That's a pretty substantial difference in most of those games. [...] From what I gather you posted the link to that YouTube video because you disagreed with argument 1. After I refuted it you no longer disagree with either argument. Am I correct?
Now I agree with both arguments, but I began a new one.
Argument 3: Although Intel CPUs are better than AMD CPUs, you cannot notice the difference in real-world settings (e.g. 1 GPU, 1080p, fps cap at 30/60).
(07-01-2013, 05:20 AM)omarman Wrote: I posted a link in my previous post showing frame time comparing the i7s and the FX 8350, and they were almost identical.
I literally only clicked to the next game and they are not even close to identical.
http://techreport.com/review/24879/intel-core-i7-4770k-and-4950hq-haswell-processors-reviewed/10
Edit: Ninja'd
But for your 3rd argument you're saying that you can't notice a difference between the two CPUs with an fps cap at 30 or 60. At 30, definitely not; even with micro-stuttering you won't be able to tell since the framerate is at a low 30. But at 60 it could still be noticeable. I'm not saying it will be, but it could.
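As a rough sketch of why the cap matters (made-up spike length, just to show the budget math):

```python
# A single slow frame only shows up as a hitch when it blows past the per-frame
# budget set by the cap: 33.3 ms at a 30 fps cap, 16.7 ms at a 60 fps cap.
spike_ms = 25.0  # hypothetical worst frame from the slower CPU

for cap_fps in (30, 60):
    budget_ms = 1000 / cap_fps
    if spike_ms <= budget_ms:
        print(f"{cap_fps} fps cap: a {spike_ms} ms frame fits the "
              f"{budget_ms:.1f} ms budget -> no visible hitch")
    else:
        print(f"{cap_fps} fps cap: a {spike_ms} ms frame misses the "
              f"{budget_ms:.1f} ms budget by {spike_ms - budget_ms:.1f} ms "
              f"-> the old frame is shown for an extra refresh")
```

So the same spike that disappears under a 30 fps cap can still repeat a frame at 60.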
omarman Wrote:Argument 3: Although Intel CPUs are better than AMD CPUs, you cannot notice the difference in real world settings. (eg; 1 GPU, 1080p, fps cap at 30/60)
This isn't really specific enough to argue. Human perception is different for everybody. Some people are far more sensitive to stuttering than others. Framerate and the game being played are also big variable factors that you have to account for that could swing things either way.
(07-01-2013, 06:34 AM)NaturalViolence Wrote: omarman Wrote:Argument 3: Although Intel CPUs are better than AMD CPUs, you cannot notice the difference in real world settings. (eg; 1 GPU, 1080p, fps cap at 30/60)
This isn't really specific enough to argue. Human perception is different for everybody. Some people are far more sensitive to stuttering than others. Framerate and the game being played are also big variable factors that you have to account for that could swing things either way.
The argument is supported by the fact that most PC games will show the same frame time and FPS on either AMD or Intel when using one GPU at 1080p with an fps cap. If these benchmark tests had an fps cap, all of the CPUs would show 60 or 30. That's what I'm trying to get at. And as far as slowdowns, the Intels have better frame times, but at the millisecond level, which is what I mean by not being able to notice it. Biologically, it takes at least 10 milliseconds to react to anything. Besides different people's ability to sense frame drops, if the difference in frame time between CPUs is 10 milliseconds or lower, no human will be able to tell the difference, and therefore it is arguable.
omarman Wrote:The argument is supported by the fact that most PC games will show the same frame time and FPS on either AMD or Intel when using one GPU at 1080p with an fps cap. If these benchmark tests had an fps cap, all of the CPUs would show 60 or 30.
Why would you play games with an fps cap?
And if you cap them at 60 fps you're definitely going to still see a significant difference in frame times considering 60 fps is only 16.67 ms per frame.
omarman Wrote:And as far as slowdowns, the Intels have better frame times, but at the millisecond level, which is what I mean by not being able to notice it.
Are you reading these correctly? They're showing how often the framerate drops and how severe the drops are. Measuring average frame time would effectively be the same as measuring average framerate. 50 ms is 20 fps. Frequent increases in frame times to 50 ms are going to be noticeable. Just look at the graphs!
Another way to look at it is 50 ms is 3 refresh cycles of waiting for one frame.
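The conversion is just frame time versus the 16.67 ms refresh interval of a 60 Hz screen, e.g.:

```python
import math

# How many 60 Hz refresh cycles (16.67 ms each) a single frame occupies.
def refreshes_at_60hz(frame_ms):
    return math.ceil(frame_ms / (1000 / 60))

for frame_ms in (16.6, 33.3, 50.0):
    print(f"{frame_ms} ms per frame = {1000 / frame_ms:.0f} fps, "
          f"waits {refreshes_at_60hz(frame_ms)} refresh cycle(s) at 60 Hz")
```

A 50 ms frame works out to 20 fps and three full refresh cycles spent showing the previous image, which is exactly the kind of hitch you notice.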
omarman Wrote:Biologically, it takes at least 10 milliseconds to react to anything. Besides different people's ability to sense frame drops, if the difference in frame time between CPUs is 10 milliseconds or lower, no human will be able to tell the difference, and therefore it is arguable.
Um....no. Where did you get that number from?
Keep in mind that reaction times and visual perception are two very different things.
In the end, it all depends on how much you want to spend and how good the results are for you.
Go for AMD if you just want to play.
Go for Intel if you don't want to lose a single frame, even if you really can't notice it.
Looking for "perfection" isn't gonna get you anywhere.