Oculus Rift Support?

Out of curiosity, what were your experiences in terms of time between ordering + receiving a unit?
-Rob

I’ve made some progress on a Rift CHOP. Getting data from the sensor is very easy, as is applying the barrel distortion shader.

I’ve gotten stuck on getting the output to look right in the headset. I can get it pretty close by manually tweaking parameters, but that’s not a real solution. Right now, I’m extracting a custom projection matrix via CHOP channels and applying it to my cameras, but something is off: lines aren’t converging properly.
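For anyone curious what that looks like on the C++ side, here’s roughly the LibOVR 0.2.x sequence the CHOP is built around. This is a sketch based on the SDK samples rather than the actual plugin source, and the variable names are mine:

```cpp
#include "OVR.h"
#include "Util/Util_Render_Stereo.h"
using namespace OVR;
using namespace OVR::Util::Render;

void riftSketch()
{
    // Initialise LibOVR and find the headset + sensor.
    System::Init();
    Ptr<DeviceManager> pManager = *DeviceManager::Create();
    Ptr<HMDDevice>     pHMD     = *pManager->EnumerateDevices<HMDDevice>().CreateDevice();

    HMDInfo hmd;
    pHMD->GetDeviceInfo(&hmd);   // resolution, screen size, IPD, distortion K's...

    Ptr<SensorDevice> pSensor = *pHMD->GetSensor();
    SensorFusion fusion;
    fusion.AttachToSensor(pSensor);

    // Head orientation - this is what ends up as yaw/pitch/roll CHOP channels.
    float yaw, pitch, roll;
    fusion.GetOrientation().GetEulerAngles<Axis_Y, Axis_X, Axis_Z>(&yaw, &pitch, &roll);

    // Per-eye projection and view-adjust matrices from the stereo utility.
    StereoConfig stereo;
    stereo.SetFullViewport(Viewport(0, 0, hmd.HResolution, hmd.VResolution));
    stereo.SetStereoMode(Stereo_LeftRight_Multipass);
    stereo.SetHMDInfo(hmd);

    StereoEyeParams left = stereo.GetEyeRenderParams(StereoEye_Left);
    Matrix4f proj       = left.Projection;    // off-centre projection, left eye
    Matrix4f viewAdjust = left.ViewAdjust;    // half-IPD translation, left eye
}
```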

I could describe my problem, but I think it’s pointless unless someone else is working on this as well. Chime in if you’ve got a Rift and want to hack on the problem together :slight_smile:

rob - I ordered mine on April 2nd, received July 12th.

There’s an Oculus unit at our local hackerspace and I’ve just installed 77FTE on the machine there, ready to have a mess with it. Not sure how much help I can be though, apart from testing. I am a bit clueless with GLSL etc. and struggling along with Python (although I know Touch well in general, so I might be helpful after all).
rod.

I want a local hackerspace :stuck_out_tongue:


I’ve created a GitHub repository in the interest of fixing / improving my attempt at Oculus Rift support in TouchDesigner.

github.com/momo-the-monster/MMMRiftChop

The repository includes:

  • A Visual Studio 2012 project for modifying/compiling the DLL
  • A TD 088 Composition with my attempt at implementing proper rendering
  • A compiled DLL if you want to get straight to the action (it’s in the TD folder with the Comp)

It’s so close - the head tracking is fine, the 3D effect is there, but something’s not right. I think it has to do with the offset for each eye (i.e. the ViewAdjust matrix to be applied after the projection matrix and barrel distortion). I’ve tried the values coming out of the Rift sensor that seem to make sense (a 0.0395 offset for each eye, in my case), but I still see some sort of de-convergence in my periphery. I don’t see it in the pre-compiled demos or when I render a scene in Cinder, so it must be a step I’m missing, or some parameters I’m not applying right.

The basic flow I’m using is:

  • Extract a 4x4 projection matrix from the sensor for the left eye
  • Flip the coordinate at M[2][0] to make the matrix for the right eye (see the sketch after this list)
  • Create one camera for each eye, using the custom matrices generated above
  • Use a Render TOP to render each eye separately at 640x800
  • Feed the output of the Render TOPs into GLSL shaders created from the example code
  • Feed the shaders the incoming uniform parameters from the Rift sensor
  • Feed the output of each shader into a Crop TOP, extending the left eye to the right and the right eye to the left
  • Composite the Crop TOPs using additive blending
  • Send the final output to a Window set to the 2nd monitor, fullscreen
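To make the matrix flip concrete: the only difference between the two eyes’ projections is the sign of a horizontal centre offset that comes from the lens geometry. A sketch (reusing the HMDInfo from the SDK; yFov/near/far are whatever the scene uses, and exactly which element holds the offset depends on how the channels get transposed on the way into TD, so treat the index as an assumption):

```cpp
#include "OVR.h"
using namespace OVR;

// Rebuild both eye projections from the HMD geometry. The lenses don't sit
// over the centre of each half-screen, so each eye gets the symmetric
// projection pre-multiplied by a small horizontal translation.
void perEyeProjections(const HMDInfo& hmd, float yFov, float zNear, float zFar,
                       Matrix4f& projLeft, Matrix4f& projRight)
{
    float aspect       = (hmd.HResolution * 0.5f) / float(hmd.VResolution);
    float eyeShift     = hmd.HScreenSize * 0.25f - hmd.LensSeparationDistance * 0.5f;
    float centerOffset = 4.0f * eyeShift / hmd.HScreenSize;   // in projection units

    Matrix4f projCenter = Matrix4f::PerspectiveRH(yFov, aspect, zNear, zFar);
    projLeft  = Matrix4f::Translation( centerOffset, 0, 0) * projCenter;
    projRight = Matrix4f::Translation(-centerOffset, 0, 0) * projCenter;

    // Equivalent to copying the left matrix and flipping the sign of the
    // horizontal offset element ([0][2] in OVR's own Matrix4f layout, which
    // may be what shows up as M[2][0] once the channels are transposed into
    // TD's convention).
}
```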

I’ve tried adding a Transform TOP after the Crops in order to change the eye separation, but none of the values from the sensor fix the problem I see.

We’re so close! Looking forward to some input on this, hope someone else has a Rift they can test with!

Excellent news! I’m expecting my Rift to arrive in a couple of weeks, so I will test then.

Rob, delivery times have improved. I ordered mine on the 30th of July and it went through processing for shipment this week, so I might see it before the end of the month. I got lucky with ordering, though; I slipped in on a non-EU/US country shipment, so my order is about 2000 units ahead of the current US orders.

I might know what’s wrong. I have a Rift and will look at your example tomorrow. Totally stoked. Thanks for doing this work. I have a stereo camera component for TD and it works well. From what you describe, you might be missing the horizontal shift necessary for an asymmetrical viewing volume.

Hmmm, I was wrong. Reading the SDK documentation, it seems the Rift does not need an asymmetrical frustum. I’ll keep testing.

The only thing I found is that the cameras weren’t being translated. So I changed the camera parameters to use the Interpupillary_distance in the riftData (±ipd/2), and I think that made things more comfortable. Not sure if it solves everything. I have to test with some different geometry.
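For reference, that corresponds to the ViewAdjust translation in the SDK; in sketch form (the sign per eye may need flipping depending on how the TD cameras are oriented):

```cpp
#include "OVR.h"
using namespace OVR;

// Shift each eye's view sideways by half the interpupillary distance.
// In the TD comp this is just tx = +ipd/2 on one camera, -ipd/2 on the other.
void applyEyeOffset(const HMDInfo& hmd, const Matrix4f& viewCenter,
                    Matrix4f& viewLeft, Matrix4f& viewRight)
{
    float halfIPD = hmd.InterpupillaryDistance * 0.5f;   // e.g. 0.064 m / 2 = 0.032 m
    viewLeft  = Matrix4f::Translation( halfIPD, 0, 0) * viewCenter;
    viewRight = Matrix4f::Translation(-halfIPD, 0, 0) * viewCenter;
}
```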

Looks like there’s something not quite right with the shader. I’ll take another look tomorrow after I digest the documentation for the SDK.

I modified the shader and changed the parameter values. It seems to work now, but it still hangs TD every time I try to exit. Next thing to fix :slight_smile:

Also no chromatic aberration fix in the shader.

I’m attaching the modified .toe
MMMRiftChopTest.12.toe (10.8 KB)

I put the System::Destroy() call after the delete instance call and now I can exit TD without a hang.
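In sketch form, the teardown order that exits cleanly looks like this (the names are the SDK samples’ rather than the CHOP’s actual members):

```cpp
#include "OVR.h"
using namespace OVR;

// Drop the device references first, then shut LibOVR down. Calling
// System::Destroy() while Ptr<> handles are still alive seems to be
// what was hanging TD on exit.
void shutdownRift(Ptr<SensorDevice>& pSensor, Ptr<HMDDevice>& pHMD,
                  Ptr<DeviceManager>& pManager)
{
    pSensor.Clear();
    pHMD.Clear();
    pManager.Clear();
    System::Destroy();
}
```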

Hi mlantin - thanks for jumping in on this!

Good catch with the System::Destroy() call, though it doesn’t seem to hang on my system without it.

Your version looks even further off in my headset - is it looking right for you? I noticed that you changed some values in the shader, namely:

ScreenCenter-float2(0.25,0.5), ScreenCenter+float2(0.25, 0.5)

to:

ScreenCenter-float2(0.5,0.5), ScreenCenter+float2(0.5, 0.5)

Can I ask why? Maybe you’re accounting for that somewhere else? The first line is verbatim from the Oculus documentation.

Hmmm. It looks right in my headset though I did notice that when the sphere is very big it starts to be uncomfortable. That might just be too much disparity. I’ll investigate.

In terms of the shader, I changed the value because the shader in the doc is designed to work on the full viewport. Since, the way you have it structured, the shader runs on only half the image, the sampling needs to be clamped differently.
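In other words (a sketch of the sampling bounds on the CPU side, not the actual shader code; fullFrame and screenCenter are just illustrative names):

```cpp
#include "OVR.h"
using namespace OVR;

// When the warp shader sees the whole 1280x800 frame, each eye may only sample
// a half-width box around its ScreenCenter (the 0.25 in the doc). When it runs
// on a single eye's 640x800 image, that box is the whole texture, hence 0.5.
Vector2f sampleClampMin(const Vector2f& screenCenter, bool fullFrame)
{
    float halfW = fullFrame ? 0.25f : 0.5f;
    return screenCenter - Vector2f(halfW, 0.5f);
}

Vector2f sampleClampMax(const Vector2f& screenCenter, bool fullFrame)
{
    float halfW = fullFrame ? 0.25f : 0.5f;
    return screenCenter + Vector2f(halfW, 0.5f);
}
```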

Thanks for the explanation! Turns out I had the wrong setting loaded on my headset, it was still calibrated for a friend. Also, sometimes I have to just look at the scene for a few seconds and my brain seems to automatically adjust things. I think your method is now looking better than mine.

I’m updating the Chop and example file, will post up a new version soon.

New version pushed to github:

  • Uses Oculus SDK 0.2.4
  • Shuts down cleanly
  • Uses mlantin’s shader/setup fixes
  • Shader fix for chromatic aberration
  • DLL only outputs configuration data once (optimization)
  • DLL compiled in Release mode

One thing I noticed in this version, mlantin, is that the Barrel Distortion feels a little bit off in the sphere, as you mentioned. I feel like the FOV isn’t quite right, but we’re inching ever closer :slight_smile:

I think I fixed another issue - I moved the lens separation math from the cameras to after the render. I think the camera separation is accounted for when I flip the custom projection matrix and feed it into the cameras. What was missing was a post-render separation for display in the lenses, which I now do via a Translate TOP after the shader and before the crop.
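For reference, the size of that shift falls out of the lens geometry in HMDInfo. A sketch of the number I’m working from (per the SDK docs; how exactly it maps onto the Translate TOP’s parameter is still my own guess):

```cpp
#include "OVR.h"
using namespace OVR;

// How far each lens centre sits from the centre of its half of the screen,
// in the per-eye viewport's normalized units (roughly 0.15 on a DK1). The
// same value drives the LensCenter uniform in the warp shader and the
// horizontal offset baked into the projection matrix.
float lensCenterOffset(const HMDInfo& hmd)
{
    float lensShift = hmd.HScreenSize * 0.25f - hmd.LensSeparationDistance * 0.5f;
    return 4.0f * lensShift / hmd.HScreenSize;
}
```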

I also added in a bunch of Instanced cubes and attached a spotlight to your head :slight_smile:

Cool! I can’t wait to go home and test!

Okay I found a bit of time to look at this again. It’s so much fun. I wish I could do this all day :slight_smile:

A few things:

  • I simplified the flip comp
  • I simplified the distort shader by making it operate on both images at once. I’m sorry I didn’t have time to integrate the chromatic stuff you had added in :frowning:
  • I noticed that the scale parameters needed to be modified by the aspect ratio (see the sketch after this list). That seems to make things a bit more comfortable.
  • The cameras do need to be shifted. This is the view matrix they mention in the doc.
  • There should be no need for an additional post-shift of the images
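For the record, the aspect-corrected scale values follow the recipe in the SDK docs. A sketch with the sample warp shader’s uniform names (assuming a configured StereoConfig; w/h are the left eye’s viewport in UV units of the full window):

```cpp
#include "OVR.h"
#include "Util/Util_Render_Stereo.h"
using namespace OVR;
using namespace OVR::Util::Render;

// Distortion shader scale uniforms for one eye, with the aspect ratio folded
// into the vertical components.
void distortionScales(const HMDInfo& hmd, StereoConfig& stereo,
                      float scale[2], float scaleIn[2])
{
    float w = 0.5f, h = 1.0f;   // left-eye viewport size as a fraction of the window
    float aspect = (hmd.HResolution * 0.5f) / float(hmd.VResolution);

    // GetDistortionScale() reports how much the warp shrinks the image,
    // so the pre-scale is its inverse.
    float scaleFactor = 1.0f / stereo.GetDistortionScale();

    scale[0]   = (w * 0.5f) * scaleFactor;
    scale[1]   = (h * 0.5f) * scaleFactor * aspect;
    scaleIn[0] = 2.0f / w;
    scaleIn[1] = (2.0f / h) / aspect;
}
```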

For myself, I made the IPD a bit smaller since my actual IPD is 54mm. Apparently that is the minimum that Oculus supports. They say in their doc that 64mm is average, but in my experience that is not true at all; people I’ve measured rarely get to 60mm.

In any case, the discomfort with far objects is because their on-screen images end up farther apart than the eye distance… so the eyes go wall-eyed (divergent). I have to investigate further.

I’m also trying to make a camera with the right viewing parameters without using the projection matrix from the stereo util… So far, major headache-inducing problems.
MMMRiftChopTest.22.toe (11.6 KB)