Dolphin, the GameCube and Wii emulator - Forums

Full Version: Metroid Prime emulation help
Venomx1 Wrote:This. I went from a 1280x1024 native monitor to 1920x1080. It's a HUGEEEEEE difference. It's like getting a gaming mouse: you never want to go back to regular crappy ones ever again.

I also recently upgraded from a 1280x1024 monitor to a 1920x1080 one. While I'm still not a huge fan of 16:9 (I would have preferred a bit more vertical resolution, like 1920x1200), I can testify that it's pretty much worth every dollar I spent. It takes some time to adjust to how big things are (increasing your font size helps massively when switching, btw), but now I'm pretty comfortable with it. You'll enjoy the difference in fullscreen mode.
Cruzar Wrote:Eh' 1024x768 looks just fine to me, as far as I can tell 1080p seems over-rated,

[Image: 3q1l0b.jpg]

This is just your subconscious trying to rationalize that it's not that important because you don't have it. A high screen resolution is so crucial to improving your user experience on a computer that there is simply no other explanation for this statement. Any tech company knows that it is without question one of the most important things you can do to boost productivity.

Cruzar Wrote:plus I'd rather 1024x768 (plan to upgrade some-day to 18.5inch and 1366x768) with 60fps on High than play my PC games in 1080p with 30-40fps on Low. Plus I plan to get newer games eventually, and newer GPU's that stomp this have problems with them in 1080p so yeah.

This is poor logic. If you want to game at a lower resolution to improve your framerate then lower your resolution in the game. This doesn't mean that you should be forced to use a lower resolution all the time, even when making documents or browsing the web. There is literally no benefit to using that outdated monitor.

Or better yet lower your settings since resolution is way more important.

Or upgrade your hardware if you have the money.

Any of those three options will work.

Cruzar Wrote:Nah' way too expensive for a size where such a high-resolution makes sense.

It makes sense for any screen above 10". Your eyes will thank you.

Even at 15" at a normal viewing distance the difference is night and day.

And used 1080p monitors are dirt cheap. I see brand new 24" 1080p monitors on sale for $150 all the time from reputable brands. I can only imagine how cheap they are used.

Cruzar Wrote:Plus like I said, don't care. I'm not Interested in 1080p, I have no need for that much desktop space, 1280x1024 is the most I'd need,

You might not "need" it to survive but you're purposefully subjecting yourself to something that is painfully inefficient for no good reason. After getting used to 1920 x 1200 my laptop screens (1366 x 768 and 1024 x 768) are extremely frustrating to use.

Switch your desktop to 640 x 480 and try using it for a week. Yeah, that's how we feel about using 1024 x 768 after getting used to 1080p.

Cruzar Wrote:plus I refuse to sacrifice Texture quality/etc, for "OMG ISH 1080p HD! (bandwagon)", to me High-Settings in 1024x768 - 1366x768 or 1280x1024 look much crisper than 1080p running with Low-Fi settings.

That's fine, you don't have to. That still doesn't mean that you should be using 1024 x 768 all the time.

Although that is a very bad opinion to have, and I doubt you would still hold it after owning one for a while. Most modern AAA PC games (especially cross-platform ones) show very little difference in image quality between medium/high/ultra settings (you can thank consoles for this), whereas changing the resolution has an enormous effect on image quality. Everything is far clearer due to more information being available. Once again, try my above example: play all of your games at 640 x 480 for a week. That's exactly what it feels like to go from 1920 x 1080 to 1024 x 768 (and mathematically it's about the same reduction in image information). Then you should understand why this is important.
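To put some back-of-the-envelope numbers on that (plain pixel-count arithmetic, nothing more):

1920 x 1080 = 2,073,600 pixels
1024 x 768 = 786,432 pixels (about a 2.6x drop)
640 x 480 = 307,200 pixels (a further ~2.6x drop from 1024 x 768)

Both downgrades throw away roughly the same fraction of the image information (around 60% of the pixels), which is why the two feel comparable.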

Also the word "crisper" is synonymous with "sharper" in computer graphics and would not make sense as a word to describe lower resolution content.

And higher image quality is not a bandwagon. That's ridiculous. Your justifications are crazy. Stop trying to convince yourself that it isn't worth it and that everyone else that has tried it is just insane or stupid. It's far more likely that they're right.
(05-28-2013, 02:35 PM)NaturalViolence Wrote: [ -> ]
"hurr durr am leet I know everything, resolution means everything, im so smurt hurr hurr your stupid get a 1080p display and game at 1080p, hurr hurr blah blah blah im talkin out mein ass" - you

I regret coming to these forums.
It's like going to a Microsoft forum for help with Linux.
Or 4chan where everyone will shove Gentoo in your face.

(05-27-2013, 09:45 PM)ExtremeDude2 Wrote: [ -> ]Then get a 1080p monitor, but don't run it at 1080p :3
I've seen what stuff looks like on a display when running below the display's native res, including a 1080p display... that'd be retarded and I refuse to look at such a fuzzy mess.
If I go 1080p it'll be when I get a better card that can run my games (not just Gmod, and not just Dolphin; I want to get into some Elder Scrolls and DMC, not to mention F.E.A.R. 2 and 3 when I can).

PS: I haven't the cash to afford a card that can handle 1080p on such games anyway. I will WAIT, and then maybe in the future, if I truly ever end up feeling I need it, but I don't; I have zero need for that much space, not to mention my eye problems (near-sighted) make viewing beyond 1280x1024 frustrating because of how much smaller everything is, and last I checked there was only one way to adjust font size and I've never seen it work really well.
Cruzar Wrote:"hurr durr am leet I know everything, resolution means everything, im so smurt hurr hurr your stupid get a 1080p display and game at 1080p, hurr hurr blah blah blah im talkin out mein ass" - you

I regret coming to these forums.
It's like going to a Microsoft forum for help with Linux.
Or 4chan where everyone will shove Gentoo in your face.

That's not a very good retort to any of my points.

It's generally a better idea to criticize the things I say rather than attack me or the forums personally. You're quick to compare this place to 4chan, but that is what people on 4chan do. They certainly do not criticize anything in the manner I just did. They attack and ridicule people in a childish manner in order to avoid long, drawn-out discussions where they have to defend their opinions against people with different opinions.

You most likely hate the forums because you hate me. And you most likely hate me because I'm disagreeing with you.

I don't see how I was being an asshat, other than maybe the image at the beginning, which I debated putting in since it is unprofessional (although very amusing). Was I critical? Yes. But I don't see how that means I was "talking out of my ass". I felt that I provided sufficient elaboration on all of my points.

Also, what does any of this have to do with Linux? I'm a bit confused on that point.

Cruzar Wrote:I've seen what stuff looks like on a display when running below the display's native res, including a 1080p display... that'd be retarded and I refuse to look at such a fuzzy mess.

I admit it will look slightly worse when running at a low resolution. The fact that your monitor is so small helps a bit in that regard. But using a low resolution is going to look like a fuzzy mess either way. And with a low resolution monitor you lack the potential to make things look better by raising the resolution in any case where you can (even if it's just for browsing the web).

Cruzar Wrote:If I go 1080p it'll be when I get a better card that can run my games (not just Gmod, and not just Dolphin; I want to get into some Elder Scrolls and DMC, not to mention F.E.A.R. 2 and 3 when I can).

The question is will you have moved on to more demanding games by then?

Because if that's the case then you'll never be able to keep up.

Also, I'm surprised that your current card can't handle all of those at 1080p 60 fps. My old 8800 GT performed similarly and was able to do all of that fine (minus DMC, since I haven't played that). Also, which Elder Scrolls game are you referring to?

Cruzar Wrote:PS: I haven't the cash to afford a card that can handle 1080p on such games anyway. I will WAIT, and then maybe in the future, if I truly ever end up feeling I need it, but I don't;

I suppose this depends on what games you plan to play.

Cruzar Wrote:I have zero need for that much space, not to mention my eye problems (near-sighted) make viewing beyond 1280x1024 frustrating because of how much smaller everything is,

If you have vision problems then having a larger monitor will benefit you even more than someone like me with perfect vision. A 24" monitor is much larger than the 15" monitor that you're using now. I recently gave my grandfather one of my old 22" monitors to replace his old 15" CRT for that very reason.

Cruzar Wrote:and last I checked there was only one way to adjust font size and I've never seen it work really well.

You should be using the Windows DPI and font size settings for that. Not the screen resolution.
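Rough numbers on the font-size point (assuming the Windows default of 96 DPI at 100% scaling): a 15" 1024 x 768 panel works out to about 85 pixels per inch, and a 24" 1920 x 1080 panel to about 92, so on-screen elements only shrink slightly at native resolution. Setting the Windows scaling to 125% (120 DPI) then makes text physically larger than it was on the old screen while keeping the full resolution for everything else.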
Okie dokey
And yeah, when I get the chance I plan to throw in a GTX 480 or something along the lines of an HD 5850 or 6870, but that's dreamland-level stuff for me, and yes, I know they're outdated and weaker than the 600/7000 series. lol

Yeah, a bigger screen would help, but one that's actually decent (no fake specs, goofy lighting that doesn't quite work, etc.) at that size, much less 1080p, is way out of range currently.

For now I'll upgrade the processor, and then later, when I have the cash and feel it necessary, I'll grab a better display, and of course, if I ever have the cash, a better video card.

Oh, PS: I also plan to eventually get into Black Ops and Black Ops II. No, I'm not a CoD fanboy, and I plan to avoid the multiplayer, but I found the single-player campaigns in the "Blops" series to be somewhat okay, 'cept last time I played Blops was on my cousin's Wii and Blops II was on his PS3, with them fuzzy shadows from the lazy PS3 devs at E.A.
Whatever floats your boat

I also play at 1024x768 on my old Samsung CRT (or 1600x1200, depending on the game), but I'm really looking forward to a 16:9 LED monitor
@omega_rugal
1024 x 768 on a CRT can look pretty good as long as it's a high-end model that uses an aperture grille. What's the screen size, and what's the model?

Cruzar Wrote:Okie dokey
And yeah, when I get the chance I plan to throw in a GTX 480 or something along the lines of an HD 5850 or 6870, but that's dreamland-level stuff for me, and yes, I know they're outdated and weaker than the 600/7000 series. lol

If you can get a used one for cheap, by all means go for it. That's what I usually recommend to friends who are on a tight budget.

Cruzar Wrote:no fake specs, goofy lighting that doesn't quite work, etc

By "goofy lighting" I assume you're referring to backlight/luminance uniformity?

Cruzar Wrote:on his PS3 with them fuzzy shadows from the lazy PS3 devs at E.A.

You have got to stop assuming that every software flaw is due to laziness.

I don't know much about the engine that CoD uses or how it handles shadow rendering but I would imagine that this is probably due to a hardware limitation. The more limited memory space of the ps3 could be forcing them to use lower resolution shadow maps since the larger ones won't fit. Or they might have had to change the shadow filtering due to a limitation with the gpu pipeline/API. That happens all the time.
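For a sense of scale (rough numbers, assuming a plain 32-bit depth format; real engines vary): a 2048 x 2048 shadow map is 2048 x 2048 x 4 bytes = 16 MB, while a 1024 x 1024 one is only 4 MB. On a console whose GPU has 256 MB of memory to share between everything, halving the shadow map dimensions frees 12 MB per map, and the visible cost is exactly the kind of soft, blocky shadows being described.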

Of course by fuzzy I'm assuming you're referring to lower resolution shadows. Without some screenshots I can't really judge. A quick google search yielded nothing on my end.
Back on topic:

I think some people are overestimating how demanding this first Metroid Prime game is. I was able to run the game at full speed with my current CPU at 2.6 GHz. Having said that, I was only using EFB to RAM when needed, and switching to EFB to Texture for the majority of the game. I will concede that I have not played through the entire game, but I was not having any issues as far as I could see early on. Saying that you would need an overclocked Ivy Bridge seems a bit excessive. I'm quite aware of which games need a great, overclocked CPU to run, but this game does not seem like one of them.
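For anyone wanting to reproduce that setup: the switch lives in Dolphin's graphics settings under the EFB copy hacks (EFB to Texture for speed, EFB to RAM where the game actually needs it, e.g. for the scan visor). It can also be set per-game in a GameINI; the sketch below is from memory, so treat the exact key names and GameID as assumptions to check against your own build:

# GM8E01.ini (NTSC-U Metroid Prime; use your disc's actual GameID)
[Video_Hacks]
# False forces EFB copies to RAM (accurate, slower); True uses EFB to Texture (fast)
EFBToTextureEnable = False
EFBCopyEnable = True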

If anyone has any doubts, feel free to post an NTSC memory card save and I will be happy to run it and post my results.
(06-04-2013, 02:23 PM)NaturalViolence Wrote: [ -> ]@omega_rugal
1024 x 768 on a crt can look pretty good as long as it's a high end model that uses an aperture grille. What's the screen size and what's the model?

Cruzar Wrote:Okie dokey
And yeah, when I get the chance I plan to throw in a GTX 480 or something along the lines of an HD 5850 or 6870, but that's dreamland-level stuff for me, and yes, I know they're outdated and weaker than the 600/7000 series. lol

If you can get a used one for cheap, by all means go for it. That's what I usually recommend to friends who are on a tight budget.

Will do.
Cruzar Wrote:no fake specs, goofy lighting that doesn't quite work, etc

By "goofy lighting" I assume you're referring to backlight/luminance uniformity?

That, and just cheap-shit lighting in general. I had a friend who once bought an HDTV because it was $50 on sale... the lighting was terrible, not to mention the pixels were all half-dead, and the replacement they "sent him" out of warranty did the same thing.
Cruzar Wrote:on his PS3 with them fuzzy shadows from the lazy PS3 devs at E.A.

You have got to stop assuming that every software flaw is due to laziness.

I don't know much about the engine that CoD uses or how it handles shadow rendering but I would imagine that this is probably due to a hardware limitation. The more limited memory space of the ps3 could be forcing them to use lower resolution shadow maps since the larger ones won't fit. Or they might have had to change the shadow filtering due to a limitation with the gpu pipeline/API. That happens all the time.

Of course by fuzzy I'm assuming you're referring to lower resolution shadows. Without some screenshots I can't really judge. A quick google search yielded nothing on my end.
Actually, I'm pretty positive it's laziness.
There are games with shadowing that looks just fine on PS3; the only issue is that it's only devs that Sony owns that have done it right, while everyone else saved the goods for the 360. What some devs have done is take whatever won't fit in RAM, typically just textures, and stream it from the hard drive. They could have easily done that here, for instance, plus used the Cell processor for graphics (which is what Uncharted 2 and 3 did) and ditched the shitty crippled Nvidia chip altogether.

Though now that I think about it, Sony should have added extra RAM to the PS3 rather than just making a not-so-very-needed "Slim-Line 2" model. In fact, I think that's what all the big console companies should do, including Nintendo. I don't mean tossing in 128 GB of RAM or some crazy stuff; like, it has 2 GB? Well, toss in another 2 GB then. It's not that expensive.
(06-04-2013, 03:14 PM)Cruzar Wrote: [ -> ]Actually, I'm pretty positive it's laziness.
Please give me some actual reasons. I promise you these consoles aren't godlike. They are pitiful by today's standards.

As far as graphics go, most Xbox 360 games render at 1024x600 while PS3 games render at 960x540. There is hardly a difference there, and once you factor in anti-aliasing you can't tell them apart. Both will be blurry on a 1080p screen. No game runs at a native 1080p resolution, and most don't even run at 720p.

Also, games don't "stream" textures from the hard drive. Loading multiple textures and "streaming" them would cause a lot of stuttering. Textures are loaded into RAM and then read from RAM, because RAM is fast. The main reason textures in games are blurry is that both systems have only 512 megabytes of total RAM (split 256/256 on the PS3), which isn't much at all, so developers have to compensate by using smaller/blurrier/compressed textures. If either console had added half a gig or more, that would have helped a lot.

Furthermore, Uncharted did not use the CPU to render graphics, nor would doing so help. That is called software rendering, and it is slow as balls. If you want to try it, Dolphin has it, and I assume you have the emulator; let me know how fast it runs for you. That "shitty" Nvidia chip is roughly equivalent to a GeForce 7900 GT, which isn't a great chip, but what the developers have managed to do with it is pretty good. I'd call it an achievement.

I can say your opinion about the RAM is valid; I actually stated in my post that adding more would have helped. But the slim editions lacked the PS2's Emotion Engine CPU; that's why you couldn't play PS2 games on anything after the first PS3s. They took it out to reduce costs. Sony lost money on every PS3 sold and still does, just not as badly.

It is 2 AM, and I had written this post very neatly before Chrome crashed, so here is this paragraph instead; I hope I re-elaborated enough for you to understand. If not, I'm sure NV will come on sometime and write more. He can explain in far more detail than I can why what you said wouldn't work.
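To quantify that "hardly a difference" (pixel counts only): 1024 x 600 = 614,400 pixels versus 960 x 540 = 518,400 pixels, so the typical 360 target has only about 19% more pixels than the PS3 one. Full 720p is 921,600 pixels and 1080p is 2,073,600, i.e. more than three times either console's usual rendering resolution, which is why both look soft on a 1080p screen.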