Well, this is just for you, Mr. Screenshots-and-Benchmarks.
AMD Turion II X2 P540 @ 2.4 GHz
4 GB DDR3-1066
AMD 5470M, 512 MB GDDR3
vs.
Core 2 Duo E7500 (underclocked via P-state to 2.4 GHz)
4 GB DDR2-800
AMD 4670, 1 GB GDDR3 (@ stock settings)
Using this Dolphin test,
set to 3x IR, 4x MSAA, 16x AF:
4670:
5470M:
Now for just 3x IR, no AA/AF:
4670:
5470M:
Now, hmmm...
5470M vs. 4670: only a 2-3 fps difference in the benchmarks, and at a higher demand? Less delta at a higher demand? Blasphemy!
Want to know why? Because the demand is so far above both cards that there will be little difference between them (something you can't seem to understand).
When one card is within its ability to perform at the given settings while the other isn't (whether at high or low settings/resolutions), there will be a much larger gap between the two cards.
When both cards can handle the settings, there is usually less delta between them, but it really goes case by case; this category breaks into many sub-cases, like whether one card is hitting such a high fps that it's giving diminishing returns, and so on.
The reason a 5470M and a 4670 (or any better graphics card) will perform identically in something like this Dolphin test is that emulation will always be held back by your CPU (assuming you meet the actual requirements for the IR/settings you are using). Since both cards can handle 1x resolution, the CPU will always hold back the graphics. At 1x IR, something like a 4670 could probably push 200 fps, but reaching that with any CPU is almost impractical.
When the graphics cards aren't being pushed far past their limits, you can get a better idea of the differences between them.
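The bottleneck argument above can be sketched with a toy model: the effective frame rate is capped by whichever of the CPU or GPU is slower. All numbers here are illustrative assumptions, not measurements.

```python
# Toy bottleneck model: frame rate is bounded by the slower component.
# Every number below is a made-up illustration, not a real benchmark.

def effective_fps(cpu_limit, gpu_limit):
    """The observed fps is capped by whichever side is slower."""
    return min(cpu_limit, gpu_limit)

cpu_fps = 60  # hypothetical CPU ceiling at these settings

# At 1x IR both GPUs exceed the CPU ceiling, so they tie (CPU-bound):
print(effective_fps(cpu_fps, 200))  # faster card -> 60
print(effective_fps(cpu_fps, 120))  # slower card -> 60

# At 3x IR both GPUs fall below the CPU ceiling, so the gap shows (GPU-bound):
print(effective_fps(cpu_fps, 24))   # 4670-like  -> 24
print(effective_fps(cpu_fps, 17))   # 5470M-like -> 17
```

This is why raising the IR until at least one card drops below full speed is what exposes the difference between them.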
So I chose 3x IR, which should be just enough to push the 4670 below 100% speed. This way the 4670 is pushed to its limit without major slowdown, while the 5470M is pushed way past its means to play at 100% speed (giving the largest delta between the two cards).
Even in this test, the differences between the 4670 and the 5470M were noticeable, but it's not a huge margin, and this is a 4670 vs. a 5470M we're talking about. The 4670 is roughly 2x faster than a 3650, so take the difference between the 4670 and the 5470M and divide by 2 to get a rough guess at the difference between a 5470M and a 3650:
1) 24 - 17 = 7 fps difference; divided by 2, a 3.5 fps difference between the 5470M and the 3650
2) 30 - 22 = 8 fps difference; divided by 2, a 4 fps difference between the 5470M and the 3650
3) 25 - 17 = 8 fps difference; again, a 4 fps difference between the 5470M and the 3650
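The estimate above can be written out explicitly. The halving factor is the rule of thumb stated earlier (a 4670 being roughly 2x a 3650); the fps pairs are the three test results:

```python
# Estimate the 5470M-vs-3650 gap by halving the measured 4670-vs-5470M gap,
# on the stated assumption that a 4670 is roughly twice as fast as a 3650.
pairs = [(24, 17), (30, 22), (25, 17)]  # (4670 fps, 5470M fps) per test

for fps_4670, fps_5470m in pairs:
    delta = fps_4670 - fps_5470m
    print(delta, delta / 2)
# -> 7 3.5
#    8 4.0
#    8 4.0
```

This is only a linear interpolation between cards, not a measurement of a 3650, which is exactly why it's framed as an educated guess.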
This is probably about the largest delta you'll see between a 5470M and a 3650 in Dolphin: a 3-4 fps difference. Wouldn't you say the 5470M would perform "probably about" the same?
And that the 5470M can easily play at 1x IR without any slowdowns, so it won't hold back any CPU from its potential?
(Which is exactly the point you were disputing with me.)
All my statement was was an educated guess based on the performance of cards I have experience with, and it turns out I'm not too far off from reality (unlike you).
That is the difference between real testing and basing your statistics on NOTHING but other people's benchmarks.
You asked earlier what the difference is? The fact that you are physically present for the test and can see everything that's going on. Of course you can't know my testing with 100% certainty (just like any other benchmarks you find online), but that's why you do your own testing and base your knowledge on something more tangible. (I've seen the 4670 and the 5470M in real play, and that's what I base this on, rather than just benchmarks, which time and time again, even when perfectly recreated, never turn out exactly the same way.)
But anyway, I'm moving on now (I think I know why Squall doesn't really like the Dolphin community much).
kthxbye