Dolphin, the GameCube and Wii emulator - Forums

Full Version: Oculus Rift support
The Oculus Rift is the new super-high field-of-view head-mounted display created by Palmer Luckey and promoted by John Carmack of id Software. They had a very successful Kickstarter launch and lots of preorders, and there's a lot of hype about it and a lot of interested developers. It's in 3D, has a 90 degree horizontal FOV and a 110 degree vertical FOV, 640x800 resolution per eye (using a 1280x800 screen split down the middle), and a 3DOF head tracker with very low latency. The optics warp the image to put more pixels near the center and fewer towards the edges and corners.

It hasn't been released yet, but we know most of the technical details, and some people on the MTBS3D forums have already made their own DIY versions.

It's currently supported on PC only, but lots of people on the Oculus Rift forums wanted to use it with consoles, particularly with Nintendo Wii games. Maybe because of the Nintendo On hoax, or maybe just because of the Wiimote's motion controls. Anyway, I figured the best way to do this is with Dolphin.

I want to add Oculus Rift support to Dolphin. Does anyone want to help? It would involve the following:
support for 1280x800 resolution (surely done already);
rendering in stereoscopic 3D (I think other people may have already done this?);
rendering side-by-side but not squashed;
detecting which game it is and allowing custom VR settings per game;
rendering with a larger vertical FOV than the game was designed for, or rendering letterboxed, or rendering cropped horizontally (depending on the game and user preferences);
optionally rendering with the correct Rift FOV rather than whatever the game normally uses, if possible;
using a pixel/fragment shader to warp the image to match the optics (a rough sketch of this warp follows the list);
faking Wiimote/nunchuk/classic controller input for head-turning in response to head-tracking;
rolling the in-game camera in response to head roll tracking;
optionally rotating the in-game camera's yaw and/or pitch in response to head-tracking;
allowing the use of Wii Motion Plus gyros instead of a sensor bar (is this already supported?) so they don't have to face the TV;
possibly allowing some controls to be remapped so the user can recenter the view/cursor using just the Wiimote and Nunchuk or classic controller (since they can't see the keyboard through the HMD);
and of course interfacing with the Oculus Rift SDK (when it's released)
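
To give an idea of the warp step, here's a rough C++ sketch of the per-pixel math (the real thing would live in a post-processing pixel/fragment shader, and the distortion coefficients are placeholders until the SDK is out):

[code]
struct Vec2 { float x, y; };

// uv is a texture coordinate in [-1, 1], measured from the lens center of one
// eye's half of the screen. k[] are hypothetical distortion coefficients, not
// whatever the real SDK will ship with.
Vec2 WarpForLens(Vec2 uv, const float k[4])
{
    const float r2 = uv.x * uv.x + uv.y * uv.y;  // squared distance from center
    const float scale = k[0] + k[1] * r2 + k[2] * r2 * r2 + k[3] * r2 * r2 * r2;
    // Sampling the eye's render target at the scaled coordinate pre-distorts
    // the image so the lens optics cancel it out, leaving more effective
    // pixels near the center of the view.
    return { uv.x * scale, uv.y * scale };
}
[/code]

The shader would evaluate that for every output pixel and sample the eye's rendered image at the warped coordinate.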

Did I forget anything? We don't have to implement all these things of course, since partial support would still be better than no support.

Does anyone have any ideas about any of those steps? Anyone want to help with any parts? Anything I should know before I jump in?

I have coded for the Wii Homebrew Channel before (appropriately, I was writing emulators for other Nintendo systems for the Wii), so I have a rough idea how the graphics system works. I'm also very familiar with the input side of things, and I've had a look at some of the Dolphin code and compiled it before. But I could still use any help people want to offer.
Well, I got head roll tracking working! That's a start.
Nice that you're working on that! This could definitely be a cool feature. Merging it with the main codebase might be tricky (lots of things will probably have to be different for VR headsets) but I'm sure it could be done.

If you need any help with Dolphin's codebase, you should join us on #dolphin-emu @ efnet, we reply faster on IRC than on the official forums.
Interestingly, the codebase already allows for freelook, although not for roll. But I added roll easily enough using the same pattern. And the code already allows for stereoscopic 3D anaglyph rendering, but I don't think it's using the right method (3D is supposed to be off-axis projection from different positions, with no toe-in camera rotation). So it's not as big a change as I expected.
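
Just to pin down what I mean by off-axis: each eye keeps exactly the same orientation, and only its position and frustum shift sideways. A minimal sketch, with made-up separation/convergence parameters rather than anything from Dolphin:

[code]
#include <cmath>

struct Frustum { float left, right, bottom, top, zNear, zFar; };

// eyeSign is -1 for the left eye, +1 for the right eye. fovY is in radians.
Frustum EyeFrustum(float fovY, float aspect, float zNear, float zFar,
                   float eyeSeparation, float convergence, int eyeSign)
{
    const float top    = zNear * std::tan(fovY * 0.5f);
    const float bottom = -top;
    const float halfW  = top * aspect;
    // Shift the frustum sideways so both views converge at 'convergence'
    // without rotating the cameras (i.e. no toe-in).
    const float shift = eyeSign * 0.5f * eyeSeparation * zNear / convergence;
    return { -halfW - shift, halfW - shift, bottom, top, zNear, zFar };
}
// The eye position is also translated by eyeSign * eyeSeparation / 2 along the
// camera's right vector; the orientation stays identical for both eyes.
[/code]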

But now I need to start separating what are currently single concepts. For example, "screen width" is simple enough now, but when I split the screen in two there will be a whole-screen width and an eye width, which are separate concepts. And viewports are currently simple enough, but once there's a viewport for each eye, it becomes more complicated.
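
To make that concrete, this is roughly how I picture the split (names are just for illustration, not actual Dolphin code):

[code]
struct Viewport { int x, y, width, height; };

// Carve a side-by-side backbuffer into two eye viewports. On the Rift's
// 1280x800 panel each eye gets 640x800.
void MakeEyeViewports(int screenWidth, int screenHeight,
                      Viewport* leftEye, Viewport* rightEye)
{
    const int eyeWidth = screenWidth / 2;  // the new "eye width" concept
    *leftEye  = { 0,        0, eyeWidth, screenHeight };
    *rightEye = { eyeWidth, 0, eyeWidth, screenHeight };
    // The scene is then rendered once into each viewport, with that eye's
    // off-axis projection but the same per-eye aspect ratio.
}
[/code]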

Anyway, I'd better get back to it...
I had a go at getting the right aspect ratio so that the view is extended vertically. The widescreen hack works for perspective projections, but not for the 2D orthographic projections used for screen elements. That means menus, cursors and things like that often don't line up, for example in Dead Space Extraction. I fixed that by multiplying the orthographic projection matrix elements by the same widescreen hack factors, and now cutscenes and 2D elements are almost where they should be (I can't work out why it's only ALMOST right, though).
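
For anyone curious, "multiplying the orthographic projection matrix elements" amounts to something like this (a sketch assuming a row-major layout and made-up scale factors, not Dolphin's actual widescreen-hack values or matrix storage):

[code]
// Scale an orthographic projection by the same factors the widescreen hack
// applies to perspective projections, so 2D HUD/cursor elements line up with
// the stretched 3D view. Assumed layout: row-major with column vectors, so
// m[0][0]/m[1][1] are the X/Y scales and m[0][3]/m[1][3] the translations.
void ApplyWidescreenHackToOrtho(float m[4][4], float wScale, float hScale)
{
    m[0][0] *= wScale;  // horizontal scale
    m[0][3] *= wScale;  // keep centered elements centered after scaling
    m[1][1] *= hScale;  // vertical scale, for the extended vertical FOV
    m[1][3] *= hScale;
}
[/code]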

The problem is that this breaks some games, like the Lego games, which no longer seem to clear the depth or colour buffers correctly for areas outside the normal screen size, so everything outside the normal screen area looks like rubbish. I know it's not that nothing is being rendered there, because when I use FreeLook to look around it works great. So I either have to work out a way of detecting when they are trying to clear the screen, clear the extra area manually each frame, or just let games like the Lego ones stretch their 2D user interfaces.
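
The "clear the extra area manually" option could look something like this in raw OpenGL (only a sketch; Dolphin's backends would do it through their own abstractions, and the rectangles are illustrative):

[code]
#include <GL/gl.h>

// Clear the border strips outside the game's own render area, for games that
// only clear the region they believe is on screen.
void ClearExtendedBorders(int fullW, int fullH,
                          int gameX, int gameY, int gameW, int gameH)
{
    const int strips[4][4] = {
        { 0,             0,             gameX,                  fullH },  // left
        { gameX + gameW, 0,             fullW - gameX - gameW,  fullH },  // right
        { gameX,         0,             gameW,                  gameY },  // bottom
        { gameX,         gameY + gameH, gameW,  fullH - gameY - gameH },  // top
    };
    glEnable(GL_SCISSOR_TEST);
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    for (const auto& s : strips)
    {
        if (s[2] <= 0 || s[3] <= 0)
            continue;  // nothing outside the game area on this side
        glScissor(s[0], s[1], s[2], s[3]);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    }
    glDisable(GL_SCISSOR_TEST);
}
[/code]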

I'm not sure whether I should be rolling the 2D elements too. I think perhaps I should (except when stretching the 2D) since they are usually supposed to line up with 3D elements.

I've noticed most games cull objects in some of the FOV area that I want to display. So I'm wondering how hard it would be to hack the game code itself to change the culling to a different-sized frustum. I remember seeing that Dolphin includes a disassembler and debugger, but I've never had much luck with it.

I've also noticed that games render some things that should stay attached to the face when you look around, render things that no longer make sense at the extended view size, and probably render some things at the wrong depths. So I may have to look at ways of filtering what is rendered, like generic 3D drivers do.
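
The usual trick those generic 3D drivers rely on is classifying every draw call by its projection matrix and then treating the categories differently (leave alone, adjust depth, or skip). A minimal sketch, again assuming a row-major layout rather than Dolphin's actual representation:

[code]
enum class DrawClass { Hud2D, World3D };

// A perspective projection generates w from the view-space z (its bottom row
// contains a -1), while an orthographic projection leaves w = 1.
DrawClass ClassifyProjection(const float m[4][4])
{
    const bool perspective = (m[3][2] != 0.0f);
    return perspective ? DrawClass::World3D : DrawClass::Hud2D;
}
[/code]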

And I've noticed that roll affects things like reflections in an incorrect way (since they are also done with projection matrices), e.g. in Mario Sports Mix, so ideally I should somehow detect when a game is trying to render a mirror.
Bumping to allow the OP to continue posting without triple posting.
(10-05-2012, 04:36 AM)NaturalViolence Wrote: Bumping to allow the OP to continue posting without triple posting.

I guess there's a rule against that? I didn't know. I only noticed the rule about no piracy.

I've noticed Dolphin is running slower than is ideal for Virtual Reality, but I'm guessing that can't be helped, and will be less of a problem for people with desktop computers or newer computers.

I don't want to add any slowdown though, and I need to render the scene twice, once for each eye, and at a higher vertical resolution and FOV. Where is the main bottleneck in terms of speed? Is it just in emulating the core, or is it the graphics rendering? Would it cause a big slowdown if I render everything twice?
I've heard of the Oculus Rift and it sounds exciting.

I have a question though...

Is it *theoretically* possible to make GameCube games compatible with this device? Would it be possible to, say, mod or hack into F-Zero GX's game code so that it supports the head tracking feature of the Oculus Rift? Imagine if, while piloting the hovercraft, we could look left, right, up, or down as we turn our heads. That would be absolutely badass!!!

How feasible is this, CarlKenner?
(10-06-2012, 07:45 PM)isamu Wrote: Is it *theoretically* possible to make GameCube games compatible with this device? Would it be possible to, say, mod or hack into F-Zero GX's game code so that it supports the head tracking feature of the Oculus Rift? ... How feasible is this, CarlKenner?

Looks like camera rotation has already been implemented in "free look" (right click and drag to look around with it enabled). You'd just have to link it to the Rift's head tracking.
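
In rough terms the linking could be as simple as polling the tracker once per frame and pushing its orientation into the free look camera. Both functions below are hypothetical placeholders (the Rift SDK isn't even out yet), just to show the shape of it:

[code]
struct Euler { float yaw, pitch, roll; };  // radians

// Placeholder: would wrap the Rift SDK's sensor fusion once it exists.
Euler ReadHeadTrackerEuler() { return { 0.0f, 0.0f, 0.0f }; }

// Placeholder: would write into whatever state Dolphin's free look uses.
void SetFreeLookRotation(float /*yaw*/, float /*pitch*/, float /*roll*/) {}

// Called once per frame, before the view matrix is built.
void UpdateCameraFromHeadTracking()
{
    const Euler head = ReadHeadTrackerEuler();
    // Free look already covers yaw/pitch-style rotation; roll is the part
    // CarlKenner added separately earlier in the thread.
    SetFreeLookRotation(head.yaw, head.pitch, head.roll);
}
[/code]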
Quote: I guess there's a rule against that? I didn't know. I only noticed the rule about no piracy.

I don't think there is a rule against it, but it's common courtesy to edit your existing posts with additional information instead of double/triple posting, unless you're trying to bump the thread.

Quote: I don't want to add any slowdown though, and I need to render the scene twice, once for each eye, and at a higher vertical resolution and FOV. Where is the main bottleneck in terms of speed? Is it just in emulating the core, or is it the graphics rendering? Would it cause a big slowdown if I render everything twice?

Everything. It depends on a number of factors: which games, what settings you're using, which OS, which backend, your hardware, etc.

It will likely cause some slowdown but whether the amount is noticeable or even measurable will depend on those factors above.