Dolphin, the GameCube and Wii emulator - Forums

Full Version: Just built my PC, is it fast enough?
(06-29-2013, 02:38 PM)omarman Wrote: All I ask is that you watch it. Be open-minded.
http://www.youtube.com/watch?v=4et7kDGSRfc
We are being open-minded.

You asked, people answered.
People stated the facts.
They told you what's going to perform better in Dolphin.

Games coded for PCs will usually get better frame rates with a better GPU. In Dolphin, however, a better CPU gives you more performance, though you still need a GPU adequate for your intended graphics settings.

Check this out:

Dolphin CPU Hierarchy:
http://forums.dolphin-emu.org/Thread-dolphin-cpu-hierarchy

Minimum specs - video card:
http://forums.dolphin-emu.org/Thread-minimum-specs-video-card

(GPU) Using higher Internal Resolutions (IR):
http://forums.dolphin-emu.org/Thread-gpu-using-higher-internal-resolutions-ir

Now, I really shouldn't have posted those since you've probably read them already, and they're in the same section of the forum.
If you really have read those and still have a question or two, feel free to ask. ;)
Why do I have this feeling that you didn't watch it >___>
(06-29-2013, 03:04 PM)omarman Wrote: Why do I have this feeling that you didn't watch it >___>
I watched it.
What people are telling you here is that Intel processors will be better than AMD's offerings at the same clock.
And at stock clocks, the 3570K runs at 3.4 - 3.8 GHz while the FX 8350 runs at 4.0 - 4.2 GHz.

On PCs, some games run better on one processor and some on the other.
In pure performance per clock, the 3570K wins.
In pure performance per core, the 3570K wins.

But the 8350 will do better in multi-threaded applications. ;)

Don't tell us that "8350's are great for future-proofing because of the number of cores," because by the time games and apps are well optimized for octa-cores,
the Vishera 8350 will be obsolete. (That's assuming games could really be scaled properly across that many cores without developers scratching their heads.)

And again, you're on a forum about the Dolphin emulator, and it is proven that Intel processors work better in it than competitors' offerings.

If you're still talking about that vid, then read this ^ post again.
omega_rugal Wrote: people here worship intel, don't waste your time.

....sigh. You're not helping.

omarman Wrote:This is as text form as it's going to get, but if you do watch the video, you will at least change your mind a little about AMD. I understand you are trying to not be biased, I can see it, you feel as though intel is statistically better based off tests and not on your opinions. I'm trying to show you, that you may be incorrect, based off his benchmarks.

I don't see any data there at all. Why doesn't this man write down his findings if he's confident in his conclusions? YouTube videos are a horrible way to get information across, and an even worse way to display it.

I'll watch it tomorrow. I really doubt he's somehow refuted the dozens, maybe hundreds, of independently conducted benchmarks done by different sources that have all produced matching results and all concluded the same thing.

I really wish he would just write down his findings in an article and provide the data if he wants us to listen to him instead of everyone else. It's not that hard. Just write down what you did and what the results were. Then people can easily perform the same tests and verify it. I don't want to listen to some guy I don't know drone on about stuff I don't need to know or care about. Get to the bloody data! Everything else is filler. Or at least provide a section of your video (or a separate video) that directly discusses the data. So far he's just bitching about some past drama in his community that I couldn't care less about.
If you can survive the filler and make it to the information, you will find shocking results.
Indeed they are. It's really annoying having to locate specific benchmarks in the video every time I want to compare his data with other sources. He really needs to write things down.

His results are outrageous. In most of his tests the FX 8350 is anywhere between 50-100% better in framerate than the i5 3570K, at stock and OC. And most of the games he chose are GPU-intensive games that shouldn't be bottlenecked by the CPU anyway, especially at high resolutions. And some of them are games that only use 1-3 cores... None of this lines up. But the thing that makes me most doubtful is that his results don't match up with other organizations at all. He mentions at the beginning how most other hardware review sites use low-resolution tests for gaming. Which is true; that's done to shift the bottleneck to the CPU side to better evaluate CPU performance. But the fact is there have been sites that have done high resolution/settings tests and had the same results. Even some that have done the exact same tests as him and gotten completely different results.

Tomshardware, for example, also tested Skyrim at max settings at 1080p with the same hotfixes from Microsoft. Their build was nearly identical except they used a GTX 680 instead of a GTX 670.

Here are Logan's stock results for the test:
FX 8350: 85 fps
i5 3570K: 77 fps

Now Tomshardware doing the exact same test:
FX 8350: 67 fps
i5 3570K: 87 fps

But they're not the only ones. I'm digging through benchmarks on Google right now, and Xbitlabs and AnandTech also tested Skyrim at max settings on these two CPUs with the hotfixes and got similar results. They used 1680 x 1050, but considering that 1920 x 1080 is only about a 17.5% increase in pixel count, the results shouldn't be that far off. How do you, or they, explain these differences, considering that all of the other sources have effectively peer reviewed each other?
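As a quick sanity check on that figure, here's the pixel-count arithmetic (a throwaway sketch, nothing more):

```python
# Pixels rendered per frame at each test resolution
px_1050 = 1680 * 1050          # 1,764,000 pixels
px_1080 = 1920 * 1080          # 2,073,600 pixels

# Relative increase going from 1680x1050 to 1920x1080
increase = px_1080 / px_1050 - 1
print(f"1920x1080 renders {increase:.1%} more pixels per frame")
```

Roughly 17.5% more pixels per frame, so the two resolutions should indeed produce comparable CPU-side results.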

The other problem is that not only did he not compile his data, he didn't write down his settings, or do anything else that would allow people to reproduce his tests! This makes him even less credible.

He also didn't state whether he used input capture or not to reproduce the exact same movements.

I'm extremely skeptical of his data. He shouldn't be getting those numbers unless something is very wrong with one or both systems. The only thing I can think of off the top of my head is a BIOS issue causing poor PCI-e bandwidth.

I'll keep digging when I have time tomorrow but so far his results aren't matching up with anyone else. They're way off from everyone else in fact. If he's the only one getting these results quite frankly there isn't much to say. I mean surely you can see my logic here? I'm not crazy for thinking that all of this screams "it's more likely that he did something wrong". If he wants to be taken seriously after putting out such crazy numbers he needs to write down everything, particularly his settings. That way other people can reproduce his tests and back him up. Thus making his data more credible. Everyone else does this because it's extremely important to the validity of your claim, I don't know why he doesn't.

Off-topic rant:
Regarding that video, just looking at the Far Cry 3 results, for example, you can clearly see something is wrong with his Intel system.
I have an i5 2400 and a GTX 660, and on MAX settings I average 30 fps with a few drops to 27-28 at 1080p.
So how can he get 25 fps with a 3570K and a 670? He should get more than me, which means more than 27 fps.
Yet he got 25 fps.
So the conclusion is clear: there was something wrong with his Intel system, and nothing wrong with the AMD, and that explains his results.

EDIT: Here is the 30+ page topic on ocn.
http://www.overclock.net/t/1353440/teksyndicate-amd-fx-8350-oc-vs-i5-3570k-oc-using-an-evga-gtx-670
They came to the same conclusion as me: something was wrong with his Intel system, most probably the ITX mobo he used.
@rpglord

He could be testing a different area and/or using different settings. Since he doesn't list anything we just don't know. I agree that he needs to go back and redo it properly if he wants anyone to take him seriously.
But all that aside, did you guys notice one VERY important thing?

1.) It was able to run every game at max settings with playable graphics
2.) It was cheaper than the intel

So what's the big deal :)

Whether or not he was using the Intel incorrectly, he was at least using the AMD correctly, and he showed that in real-life situations the FX 8350 is good enough to run PC games on max settings at 1080p with playable FPS, and that it's cheaper than the Intel. That's why I got it. I didn't see why I would need a faster CPU when I don't have to; I'm not a benchmarker, and if there's an FPS cap there will be no difference when playing a game. :)
Most CPUs can achieve high fps in virtually any modern game, even horribly outdated CPUs. As I mentioned earlier, the real question is: can they maintain that framerate even when the game puts them under load?

CPU load in games is not consistent (unlike nearly every other application). It varies through different segments of the game depending on what's going on. A good CPU is one that does not experience temporary framerate drops (stuttering or microstuttering) when something that demands high CPU throughput happens (a sudden swarm of AI enemies, explosions with collision physics, etc.).

We measure these drops through frametimes, not framerates. The reason is that occasional large drops in framerate won't change the average framerate by more than a few fps, so the CPU will still seem to be performing well even when it isn't if we go by framerate measurements alone. Most of the time CPU load is very low and the bottleneck is on the GPU side, which is why framerates in modern games depend so heavily on GPU performance, and why CPUs that are massively faster or slower don't affect average framerate very much.

This is why most major hardware review organizations have started including frametime measurements in their CPU benchmarks. It's also because we've only recently had the software and hardware necessary to take accurate frametime measurements; until a few years ago it was virtually non-existent.
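To make that concrete, here's a minimal sketch with made-up frametime numbers (not real benchmark data): two hypothetical 60-frame runs with nearly identical average framerates, where a percentile frametime exposes the stutter the average hides.

```python
# Hypothetical frametimes in milliseconds for two 60-frame runs.
# 'smooth' paces every frame evenly; 'stutter' is faster on most
# frames but spikes hard on three of them.
smooth = [16.7] * 60
stutter = [14.0] * 57 + [70.0, 80.0, 75.0]

def avg_fps(frametimes_ms):
    """Average framerate over the run: frames divided by total seconds."""
    return len(frametimes_ms) / (sum(frametimes_ms) / 1000.0)

def p99_frametime(frametimes_ms):
    """99th-percentile frametime: how bad the worst ~1% of frames get."""
    ordered = sorted(frametimes_ms)
    return ordered[min(len(ordered) - 1, int(len(ordered) * 0.99))]

# Both runs average close to 60 fps...
print(f"smooth:  {avg_fps(smooth):.1f} fps, p99 {p99_frametime(smooth):.1f} ms")
print(f"stutter: {avg_fps(stutter):.1f} fps, p99 {p99_frametime(stutter):.1f} ms")
# ...but the stuttering run's 99th-percentile frametime is several
# times worse, which is exactly what an average framerate conceals.
```

This percentile-style number is essentially what review sites report as "99th percentile" or "1% lows" alongside average fps.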

If you look at any frametime measurements out there comparing the FX 8350 against a modern i5 or i7, its performance ranges from slightly worse to a lot worse depending on the game. Whether that performance is "good enough" for you depends on your standards and which game you're comparing. But regardless, the Intel platforms are objectively faster for PC games, which is why people recommend them for PC gaming. They cost about the same too: you can get an i5 3470 for the same price and an i5 3570 for slightly more.

The bottom line is that for high-budget rigs doing PC gaming and emulation, you really want to stick with an Intel CPU if you can. You clearly run some applications that run faster on the FX 8350, but PC games and emulators are not going to be among them. Also, my argument never had anything to do with whether it was "good enough".

And for the record there are a number of other "little things" wrong with the video in question that I could nitpick about. But I don't want to do so as it would be a waste of time and would draw attention away from my main argument. And I've already done enough of that in my spoiler tags.

Edit: Here is a recent article AnandTech did on 1440p gaming, Haswell vs. Vishera, for anyone who might be curious: http://www.anandtech.com/show/6985/choosing-a-gaming-cpu-at-1440p-adding-in-haswell-/6
They reached the same conclusion as Tomshardware with similar data. Xbitlabs and some other organizations did some tests too, but I'm too lazy to look them up right now.