Dolphin, the GameCube and Wii emulator - Forums › Dolphin Emulator Discussion and Support › Hardware v

Metroid Prime emulation help

06-04-2013, 05:28 PM (This post was last modified: 06-04-2013, 06:16 PM by NaturalViolence.)
#31
NaturalViolence Offline
It's not that I hate people, I just hate stupid people
Posts: 9,013
Threads: 24
Joined: Oct 2009
Starscream Wrote: Back on topic:

I think some people are overestimating this first Metroid game. I was able to run the game at full speed with my current CPU at 2.6GHz. Having said that, I was only using EFB to RAM when needed, and switching to EFB to Texture for the majority of the game. I will concede that I have not played through the entire game, but I was not having any issues as far as I could see early on. Saying that you would need an Ivy Bridge overclocked seems a bit excessive. I'm quite aware of which games need a great CPU overclocked to run, but this game does not seem like one of them.

If anyone has any doubts, feel free to post an NTSC memory card save and I will be happy to run it and post my results.

For me it was hit or miss on my old rig. Since some of the visors don't work properly with EFB copy to texture, I had to leave EFB copy to RAM on all the time later in the game. It ran at full speed in a lot of places, and the audio had only a few minor bugs with HLE. However, in a lot of places it did not run well. I did actually play through the entire game. The random crashes (which it still encounters, to my knowledge) were extremely annoying and seemed to become more frequent later on.

The worst spot I found was the boss fight with Flaahgra, the giant plant monster. On my Q6600 @ 3.2GHz it ran at 20 fps with HLE audio, which actually made the fight extremely easy, since at 1/3 speed predicting and dodging its moves is even easier. The second worst was the first room in the crashed pirate frigate zone (the big room at the top). I noticed that it generally runs extremely slowly in big outdoor areas; another example of this is the outside of the pirate frigate in the intro.

@garrlker

Most of what you posted is wrong or lacking context. You also didn't really answer his question; I'll respond to it tomorrow.

I do have time to respond to cruzar though since his post is much shorter.

Cruzar Wrote: Actually, I'm pretty positive it's laziness.
There are games with shadowing that look just fine on PS3; the only issue is that only the devs Sony owns have done it right, everyone else saved the goods for the 360. What some devs have done is take whatever won't fit in RAM, typically just textures, and stream it from the hard drive. They could easily have done that, for instance.

The vast majority of Xbox 360 and PS3 games do what you just described; it's called texture streaming. I can assure you that the engine IW has used for all CoD games since CoD4 uses this technique on both the Xbox 360 and PS3 versions. The PS3 still has less usable memory, and texture streaming can't fix that.

Shadow maps cannot be streamed. They are not static, they are dynamic, so this doesn't really relate to the problem at all.

You really need to read more about how 3D graphics are rendered before you start passing judgement on developers like this. Also, the term you're looking for is "shadow mapping", not "shadowing"; shadowing is the act of following someone.

Cruzar Wrote: ...plus used the Cell processor for graphics (which is what Uncharted 2 and 3 did) and ditched the shitty crippled Nvidia chip altogether.

I may not have a PS3. I may not have made any PS3 applications. I may not follow PS3 game development much, if at all. But I know this is completely wrong, because I have read some of Naughty Dog's whitepapers out of my interest in 3D graphics. They did not "ditch the crippled Nvidia chip". In fact, it does most of the rendering work, since it's much faster than the Cell BE's SPEs at most 3D rendering tasks. Their engine uses a custom pipeline for 3D rendering with a mixture of hardware-accelerated (GPU) and software (Cell BE) rendering stages. The SPEs on the Cell are used mostly for the transformations, if I recall correctly. They also do some minor post-processing with the SPEs (DoF and AA are both implemented CPU-side; I'm not sure what else).
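Since Naughty Dog's actual engine code isn't public, here is only an illustrative sketch of what a software (CPU-side) vertex-transform stage does, the kind of work attributed to the SPEs above. All names and the toy matrix are my own, not anything from a real engine:

```python
# Illustrative sketch of a software vertex-transform stage: apply a
# model-view-projection (MVP) matrix to each vertex on the CPU, then
# do the perspective divide (clip space -> normalized device coords).

def mat_vec4(m, v):
    """Multiply a 4x4 matrix (row-major list of rows) by a 4-vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def transform_vertices(mvp, vertices):
    """Transform (x, y, z) vertices by `mvp` and divide by w."""
    out = []
    for (x, y, z) in vertices:
        cx, cy, cz, cw = mat_vec4(mvp, [x, y, z, 1.0])
        out.append((cx / cw, cy / cw, cz / cw))
    return out

# Toy "MVP": identity rows for x, y, z, plus w_out = z_in so the
# perspective divide actually does something.
mvp = [
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 1, 0],
]
print(transform_vertices(mvp, [(2.0, 4.0, 2.0)]))  # -> [(1.0, 2.0, 1.0)]
```

On the PS3 the output of a stage like this would be handed back to the GPU for rasterization, which is the hybrid split described above.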

Cruzar Wrote: Though now that I think about it, Sony should have added extra RAM to the PS3 rather than just making a not-so-very-needed "Slim-Line 2" model. In fact, I think that's what all the big console companies should do, including Nintendo. I don't mean tossing in 128GB of RAM or some crazy stuff. Like, it has 2GB? Well, toss in another 2GB then. It's not that expensive.

This is a terrible idea for several reasons.

1. It's more expensive than you might realize, especially since the PS3 uses XDR RAM for main memory, which requires a license from Rambus to produce.
2. It makes the console bigger if you don't wait for memory densities to increase.
3. It changes the memory space, which forces all the software tools to be changed in turn. At that point you might as well upgrade the rest of the hardware, since you'll have to redo everything either way.
4. It completely defeats the point of a console by fragmenting the hardware and software, forcing devs to make different versions of each product. You've effectively created a whole new console/platform: the PS3.5. Extremely fragmented hardware and software was the primary cause of the video game industry crash of 1983, sucking the profitability out of both sides of the industry. Sega also did this too much in the 90s, and it ruined them.
"Normally if given a choice between doing something and nothing, I’d choose to do nothing. But I would do something if it helps someone else do nothing. I’d work all night if it meant nothing got done."  
-Ron Swanson

"I shall be a good politician, even if it kills me. Or if it kills anyone else for that matter. "
-Mark Antony
06-05-2013, 04:55 AM
#32
Runadumb Offline
Junior Member
Posts: 14
Threads: 0
Joined: Nov 2012
(06-04-2013, 02:46 PM)Starscream Wrote: Back on topic:

I think some people are overestimating this first Metroid game. I was able to run the game at full speed with my current CPU at 2.6GHz. Having said that, I was only using EFB to RAM when needed, and switching to EFB to Texture for the majority of the game. I will concede that I have not played through the entire game, but I was not having any issues as far as I could see early on. Saying that you would need an Ivy Bridge overclocked seems a bit excessive. I'm quite aware of which games need a great CPU overclocked to run, but this game does not seem like one of them.

If anyone has any doubts, feel free to post an NTSC memory card save and I will be happy to run it and post my results.

I don't know about that. I played through the whole of Metroid Prime 1 on my overclocked i7 920 with 2x GTX 570s, and the framerate TANKED in places and was all over the place most of the time. I maybe didn't have it optimally set up, but it's certainly the most demanding GC game I've tried.

There's a spot early in the game that gave me heavy slowdown: the place where you drop off all the artefacts. That would be the place to start messing around with settings and framerates.
06-05-2013, 08:48 AM
#33
garrlker Offline
That one guy
Posts: 183
Threads: 7
Joined: Feb 2012
@NV Yeah, I just googled to see how I got texture streaming wrong. What it is and what I thought he was talking about were completely different; that's my bad. And sorry, it was 2-3 am, and I'm sure your post is going to be a huge list of what I got wrong, lol. Just go easy.
On topic: I haven't tried Metroid on my new rig, but on my old rig (Core 2 Duo @ 2.8GHz) it didn't run well at all. I was usually at 20-40 frames, and that was with HLE audio.
06-05-2013, 08:57 AM
#34
NaturalViolence Offline
It's not that I hate people, I just hate stupid people
Posts: 9,013
Threads: 24
Joined: Oct 2009
garrlker Wrote: Please give me some actual reasons. I promise you these consoles aren't godlike.

He never said they were.

garrlker Wrote: As far as graphics go, most Xbox 360 games use 1024*600 while PS3 games use 960*540.

The resolutions vary pretty widely from game to game. I would say the average is a bit higher than that: somewhere between 640p and 720p for the Xbox 360, and between 600p and 720p for the PS3.
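To put those numbers in perspective, here's a quick pixel-count comparison in plain Python. The two sub-HD figures are the ones quoted above; 720p is added purely as a reference point:

```python
# Pixel counts for the render-target resolutions discussed above.
resolutions = {
    "1024x600 (quoted Xbox 360 figure)": (1024, 600),
    "960x540 (quoted PS3 figure)": (960, 540),
    "1280x720 (full 720p, for reference)": (1280, 720),
}

full_720p = 1280 * 720  # 921,600 pixels

for name, (w, h) in resolutions.items():
    pixels = w * h
    pct = 100 * pixels / full_720p
    print(f"{name}: {pixels:,} px ({pct:.0f}% of 720p)")
```

So the quoted Xbox 360 figure is about two thirds of a full 720p framebuffer and the quoted PS3 figure just over half, which is why both look soft when scaled to a 1080p screen.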

garrlker Wrote:There is hardly a difference there, especially when you factor in anti aliasing, you can't tell a difference.

Speak for yourself. I sure as hell can.

garrlker Wrote:Both will be blurry on a 1080p screen.

True.

garrlker Wrote:No game is run at a 1080p resolution,

This is wrong. Very few games render at 1080p, but there are a few on the PS3.

garrlker Wrote:and most don't run at 720.

True. Most run at slightly below 720p.

garrlker Wrote:Also games don't "stream" textures from the hard drive.

Um.....yes they do.

garrlker Wrote:Loading multiple textures and "streaming" them would cause a lot of stuttering.

It sometimes does; it depends on how it's implemented. Typically, low-resolution versions are loaded with the rest of the level's assets, while higher-resolution versions are streamed into memory as they are needed. The more common side effect of this is "texture pop-in".
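As a rough illustration of that scheme (every class and method name here is hypothetical, not any real engine's API):

```python
# Minimal sketch of mip-based texture streaming as described above:
# a low-resolution mip is always resident; the high-resolution mip is
# requested asynchronously and "pops in" once the load completes.

class StreamedTexture:
    def __init__(self, name):
        self.name = name
        self.resident_mip = "low"   # low-res mip loaded with the level
        self.pending_request = False

    def request_high_res(self):
        # In a real engine this would queue an async disk read.
        self.pending_request = True

    def on_stream_complete(self):
        # Called when the background load finishes; until then the
        # renderer keeps sampling the low-res mip (visible as pop-in).
        if self.pending_request:
            self.resident_mip = "high"
            self.pending_request = False

    def sample(self):
        return self.resident_mip

tex = StreamedTexture("wall_albedo")
tex.request_high_res()
print(tex.sample())     # still "low": this window is the pop-in
tex.on_stream_complete()
print(tex.sample())     # "high" once streaming finishes
```

The stutter mentioned in the quote shows up when an engine blocks on the disk read instead of sampling the resident low-res mip while it waits.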

garrlker Wrote:Textures are loaded into ram, then loaded from ram because ram is fast.

Loaded from RAM to where? Loading is the process of copying data from the HDD into RAM.

garrlker Wrote: The main reason textures in games are blurry is that both systems have 512 megabytes of shared RAM, which isn't much at all, so they have to compensate by using smaller/blurrier/compressed textures.
garrlker Wrote: If either console had added half a gig or more, that would have helped a lot.

Correct.

garrlker Wrote: Furthermore, Uncharted did not use the CPU to render graphics. Nor would using that help.

Yes, it did. And yes, that did help.

garrlker Wrote:That is called software rendering, it is slow as balls. If you want to try it, dolphin has it, and I can assume you have the emulator. Let me know how fast it runs for you.

It's slow as balls for two reasons:
1. It's extremely poorly optimized, according to neobrain himself; he hasn't had much time to work on it.
2. x86 CPUs are very poorly suited to this type of workload. Believe it or not, Piledriver would be the best x86 microarchitecture for it at the moment.

To give a quick and dirty summary: the Cell BE (Broadband Engine) is basically a single slow "regular" CPU core with 8 "GPU-like" cores on one chip. The PPE is similar to a general-purpose CPU core, and the SPEs are highly specialized SIMD processors with limited instruction sets, fast I/O, big register files, etc., just like a GPU. They're more programmable than a normal GPU from that era, though, which is highly advantageous to software developers who want to get their hands dirty. The idea Sony's engineers had was no doubt to offer the flexibility of software rendering (CPU) with the performance of hardware rendering (GPU) in a single chip. They tried to do this with the PS2 as well, much more successfully I might add. I'll elaborate more on this in the future if I have time.
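As a toy model of that PPE/SPE split (purely illustrative: real SPE programming uses DMA transfers into each SPE's 256KB local store and runs the chunks in parallel, none of which is modeled here):

```python
# Toy model of the work-partitioning shape described above: one
# general-purpose core (the PPE) splits a data-parallel job into
# fixed-size chunks and hands them to 8 specialized workers (the SPEs).

NUM_SPES = 8

def ppe_dispatch(data, kernel):
    """Split `data` into up to NUM_SPES contiguous chunks and run
    `kernel` over each chunk, as the SPEs would in parallel."""
    chunk = (len(data) + NUM_SPES - 1) // NUM_SPES  # ceil division
    results = []
    for spe in range(NUM_SPES):
        # Stand-in for the DMA into an SPE's local store.
        local_store = data[spe * chunk:(spe + 1) * chunk]
        results.extend(kernel(local_store))
    return results

# Example kernel: scale a vertex stream, the sort of branch-light,
# SIMD-friendly loop the SPEs were built for.
scaled = ppe_dispatch(list(range(16)), lambda chunk: [x * 2 for x in chunk])
print(scaled)   # every element doubled, order preserved
```

The point of the design is visible even in the toy: the "PPE" only orchestrates, while all the number crunching happens inside the per-chunk kernel.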

garrlker Wrote: That "shitty" Nvidia chip is equivalent to a 7900 GT, which isn't a great chip, but for what it is, the things developers have made using it are pretty good. I'd call it an achievement.

It depends on which spec you're looking at, but I would say that's a fairly reasonable equivalent.

garrlker Wrote: I can say your opinion about the RAM is valid; I actually stated in my post that adding more would have helped. But the slim editions lacked the PS2's Emotion Engine CPU. That's why you couldn't play PS2 games after the first PS3s; they took it out to reduce prices.

They took that out long before the slim or super slim editions. Only the very first generation of PS3 consoles (which were only sold in the US) had it.

Also, I don't know why you're bringing that up.

garrlker Wrote: Sony lost money on every PS3 sold, and still does; it just isn't as bad.

I'm pretty sure that's no longer true.

garrlker Wrote: It is 2 am, and I had written this post very neatly. Then Chrome crashed, so here is this paragraph; I hope I re-elaborated enough for you to understand.

I feel your pain. This has happened to me a few times, and the rewrite is never quite as good as the original.
"Normally if given a choice between doing something and nothing, I’d choose to do nothing. But I would do something if it helps someone else do nothing. I’d work all night if it meant nothing got done."  
-Ron Swanson

"I shall be a good politician, even if it kills me. Or if it kills anyone else for that matter. "
-Mark Antony
Website Find
Reply
06-05-2013, 04:05 PM
#35
omega_rugal Offline
A thorn on your side
Posts: 137
Threads: 6
Joined: Mar 2013
It's a Samsung SyncMaster 997MB, 19 inches.

http://reviews.cnet.com/crt-monitors/samsung-syncmaster-997mb-crt/4505-3175_7-31445664.html

It got bad reviews? Weird, I haven't seen any of the mentioned problems. Besides, why are they dated 2006? I got it in 2004; was that a different model, or what?
06-09-2013, 09:19 PM (This post was last modified: 06-09-2013, 09:21 PM by Cruzar.)
#36
Cruzar Offline
Junior Member
Posts: 13
Threads: 1
Joined: May 2013
That's interesting.
Sorry I haven't checked back on this thread, been busy. . . well, if you can call it busy.

I'm thinking, since I won't have the cash, I'll just nab an i3 (Sandy Bridge) instead for now and enjoy a good boost in my PC games (okay, from what I've read a massive one: sometimes around or faster than a Phenom II X6, sometimes just slightly slower than an i5, excluding an i5 at 4+GHz), and hold off on an i5/i7 for when I have an actual job.

Or just forget Dolphin emulation and jump on the AMD wagon, after witnessing even Bulldozer touching an Ivy Bridge i7 several times in real-world testing with games I would love to get my hands on and play (demolishing their older Phenom II counterparts pretty damn badly), which pretty much shatters the whole "synthetic benchmark" thing for me, especially upon learning of the Cinebench scandal that still continues to this day and is done by other software too, such as the "Batman: Arkham" series. :I

On a side note, there are PSX and N64 games that have shadows, and they aren't that pixelated and blocky, so. . . whaddafuq?
06-10-2013, 05:27 AM (This post was last modified: 06-10-2013, 05:28 AM by NaturalViolence.)
#37
NaturalViolence Offline
It's not that I hate people, I just hate stupid people
Posts: 9,013
Threads: 24
Joined: Oct 2009
Cruzar Wrote: Or just forget Dolphin emulation and jump on the AMD wagon, after witnessing even Bulldozer touching an Ivy Bridge i7 several times in real-world testing with games I would love to get my hands on and play (demolishing their older Phenom II counterparts pretty damn badly), which pretty much shatters the whole "synthetic benchmark" thing for me,

You would be wasting a lot of money as a PC gamer. The FX-series CPUs get their asses handed to them in virtually every game benchmark, both in framerate and in frame times (frame latency), even being outperformed by Phenom II in some games. Bulldozer and Piledriver are inherently bad microarchitectures for the type of workload a typical video game is.

Not to mention Dolphin and PCSX2 are going to run a lot better on a Sandy Bridge, Ivy Bridge, or Haswell CPU.

Cruzar Wrote: especially upon learning of the Cinebench scandal that still continues to this day and is done by other software too, such as the "Batman: Arkham" series.

What scandal? I haven't heard of either of these scandals, nor managed to find anything about them through a Google search.

Cruzar Wrote:On a side note, there are PSX and N64 games that have shadows, and they aren't that pixelated and blocky so. . .whaddafuq?

Well, they won't look so nice when rendered at native resolution, but when rendered at a high resolution through an emulator, sure. There could be a number of reasons for this. It could be that they use static shadows, or fixed-function rendering, or that they use shadows very minimally, with very unrealistic algorithms that don't account for many factors that affect shadow color, sharpness, shape, etc. Likely all of the above.

The point is that realistic dynamic shadow rendering through pixel shaders is very demanding. To reduce the performance hit, developers either lower the draw distances so that fewer objects have shadows drawn, make more of the shadows static, or lower the shadow map resolution, resulting in blocky shadows. They apply some combination of these three techniques to bring the performance hit down to an acceptable level.
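For anyone curious, here is a bare-bones sketch of the shadow-map depth test being discussed. It also shows exactly where low map resolution turns into blocky shadows: the quantization of the fragment's light-space position to a texel. This is illustrative Python, not real shader code, and the function name and layout are my own:

```python
# Minimal shadow-map depth test. A real implementation runs per-pixel
# in a shader; the blockiness at low resolutions comes from many screen
# pixels landing in the same shadow-map texel and getting the same
# lit/shadowed answer.

def shadow_lookup(shadow_map, u, v, fragment_depth, bias=0.005):
    """Return True if the fragment is in shadow.
    shadow_map: square 2D list of depths as seen from the light (0..1).
    (u, v): fragment position projected into light space (0..1).
    bias: small offset to avoid self-shadowing ("shadow acne")."""
    size = len(shadow_map)
    # Quantize to a texel: the source of blockiness at low resolutions.
    x = min(int(u * size), size - 1)
    y = min(int(v * size), size - 1)
    occluder_depth = shadow_map[y][x]
    return fragment_depth > occluder_depth + bias

# A 2x2 "shadow map": the left half sees an occluder at depth 0.3,
# the right half sees nothing (far plane, depth 1.0).
shadow_map = [[0.3, 1.0],
              [0.3, 1.0]]
print(shadow_lookup(shadow_map, 0.25, 0.5, 0.8))  # True: behind the occluder
print(shadow_lookup(shadow_map, 0.75, 0.5, 0.8))  # False: light reaches it
```

Doubling the map resolution quadruples its memory and fill cost, which is the tradeoff that pushes console games toward the low, blocky resolutions complained about in this thread.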
"Normally if given a choice between doing something and nothing, I’d choose to do nothing. But I would do something if it helps someone else do nothing. I’d work all night if it meant nothing got done."  
-Ron Swanson

"I shall be a good politician, even if it kills me. Or if it kills anyone else for that matter. "
-Mark Antony
Website Find
Reply
06-10-2013, 08:20 PM
#38
Cruzar Offline
Junior Member
Posts: 13
Threads: 1
Joined: May 2013
That's quite an interesting claim. (Sarcasm)
Tek Syndicate, for instance (and not just them, others too, though they're hard to find, since most sites running benches/tests are pretty much owned by Intel), did tests of their own, and the FX only occasionally loses to Intel in real-world scenarios; an 8350 can sometimes even keep up with a 3820.

Also, the Cinebench scandal was a scandal involving Cinebench and the Intel compiler that it (and other synthetic benchmark writers) used. When it detected an Intel chip, it took the most optimized code path possible, but if it saw an AMD chip, it took the worst path possible, making sure that even though your processor can clearly do much better than the test claims, you think it's slow as shit and worthless and that you must buy Intel. When patched to make it think you're always on Intel, AMD chips saw a massive performance jump.

Sort of like that Nvidia driver scandal/hack with the GeForce FX series ages ago, only as if they had modified 3DMark to run slower on ATI cards.

As for the whole shadow mapping thing, that's just retarded. I'd rather have a simple static shadow that pretended to be dynamic than an ugly piece of poo. That's like deciding to use bilinear filtering and 2x2 texture sizes (then stretching them out); all of the do-not-want.