Quote from Scawen :Thanks for the tip!

You're welcome!
Maybe the crash has something to do with our old gfx cards, which aren't handled properly in the Oculus code... who knows.
At least it gives me hope that LFS will run smoothly, as long as you have similar hardware.
Don't really want to buy a new rig atm just for the DK2.

P.S. Watch out for the white shark attacking you in the deeper water! It can be quite intense if you don't expect him.
Quote from Scawen :Thanks for the tip!

Ocean Rift worked very well and turned out to be my first VR experience.

In LFS DK2 code, I have got a grip on all the tracking (in a separate test program). So now I need to work on the graphical side. The distortion is completely different from the old system, so I don't know how hard it will be.

Isn't all the distortion handled now, by the OVR SDK? Though, I suppose that doesn't necessarily make it magically easy to implement, because you still need to work that into the pipeline, but at least you shouldn't have to be writing shaders yourself.

Either way, good luck! I'm really excited to race with my DK2
Quote from Tresch :Isn't all the distortion handled now, by the OVR SDK? Though, I suppose that doesn't necessarily make it magically easy to implement, because you still need to work that into the pipeline, but at least you shouldn't have to be writing shaders yourself.

Well there are two ways now, but neither of them is like the old way.

I find both ways quite complicated and confusing, compared with the old system. But I am reserving judgement. Sometimes it takes a little time and sleeping to get my head around things. Let's see if I can get one of them done today...
Personally I'm astonished that DK2 is incompatible with DK1, considering the whole point of the dev kits was to allow developers to get up to speed with programming for the thing.
I tried LFS with DK1+G27 yesterday. 10 out of 10, nearly puked

No, but seriously: before I got really sick, I managed to play for about 10 minutes. The scale seemed quite accurate, I was able to read most of the text despite the DK1's low resolution, and the floating HUD was nice.

I was driving the FXR, so I set my G27 to 540 degrees in the profiler and in the game. It felt like I was looking at my own hands; it actually felt weird when I took my hands off the steering wheel but in the game they stayed on it. In VR, having a virtual body really helps the immersion. I think a proper steering wheel is mandatory if you want to play LFS with the Rift.

I hope that the VR sickness will eventually go away. My brain just can't comprehend how it can be that I'm taking corners at 200 km/h and yet can't feel any G-forces.

Some people report that DK1 was making them sick, but with DK2 they feel fine:
http://www.reddit.com/r/oculus ... ery_vr_sick_dk2_does_not/

LFS is back in the top 5 on Oculus Share. There are some people asking for DK2 support; maybe you should post a comment saying you are working on it, Scawen.
Quote from Scawen :.... Sometimes it takes a little time and sleeping to get my head around things. Let's see if I can get one of them done today...

As usual :clapclap:
Good luck, by the way.
Quote from Amynue :I hope that the VR sickness will eventually go away. My brain just can't comprehend how it can be that I'm taking corners at 200 km/h and yet can't feel any G-forces.

Someone should try the Oculus Rift on a motion simulator like the Force Dynamics 401 or something like that.
The Rift is a no-go on motion rigs; the head tracking would go completely berserk, as it combines IR tracking with attitude-based gyro tracking.
Quote from Matrixi :The Rift is a no-go on motion rigs; the head tracking would go completely berserk, as it combines IR tracking with attitude-based gyro tracking.

If the motion platform told the Oculus SDK (or whatever is appropriate) how many degrees it had moved, then the rotational tracking could be corrected (the camera for positional tracking can be hard-mounted to the moving rig). Not that this is actually done, but I guess it's doable if someone really gets the motivation to do it.
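Conceptually it's just a matter of removing the rig's rotation from the headset's measured orientation before the view is built. A minimal sketch of the idea, with hypothetical names and plain quaternion maths (this is not any actual SDK call):

// Minimal sketch: cancel the motion platform's rotation out of the headset's
// measured orientation, so the tracked view stays fixed relative to the rig.
// Hypothetical names for illustration only.
struct Quat
{
    float w, x, y, z;

    // Composition of two rotations (Hamilton product)
    Quat operator*(const Quat& q) const
    {
        return Quat{ w*q.w - x*q.x - y*q.y - z*q.z,
                     w*q.x + x*q.w + y*q.z - z*q.y,
                     w*q.y - x*q.z + y*q.w + z*q.x,
                     w*q.z + x*q.y - y*q.x + z*q.w };
    }

    // For unit quaternions the inverse is the conjugate
    Quat Conjugate() const { return Quat{ w, -x, -y, -z }; }
};

// headWorld: orientation reported by the headset sensors (world frame)
// rigWorld:  orientation of the motion platform, reported by the rig controller
// Returns the head orientation relative to the rig, with the platform motion
// removed (assuming the convention headWorld = rigWorld * headRelative).
Quat HeadRelativeToRig(const Quat& headWorld, const Quat& rigWorld)
{
    return rigWorld.Conjugate() * headWorld;
}

The positional side would then come for free from hard-mounting the camera to the rig, as above.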
Motion Rig
Quote from Matrixi :The Rift is a no-go on motion rigs; the head tracking would go completely berserk, as it combines IR tracking with attitude-based gyro tracking.

This is bull; it's just a matter of how much motion you use. I'm using a DK1 with a motion rig, and it's no problem whatsoever. The added cues from the motion rig make the immersion perfect.

It is also true that in DK2 you can mount the camera on the rig, and therefore all movement of the head will be in relation to the rig and correct.

The only thing that will throw it off is continuous rotation. I tried the DK1 in a 401cr, and it did not work.

I must admit I have not tried the DK2 yet, since the two I bought are sitting in my office and I am on vacation, but I'll hook one up to one of the motion rigs next week and report back as soon as Scawen has managed support.
Quote from oleroc :I'm using a DK1 with a motion rig, and it's no problem whatsoever.

Quote from oleroc :The only thing that will throw it off is continuous rotation. I tried the DK1 in a 401cr, and it did not work.

Some contradiction there.

I've used both a 301 and a DK1; they do not work together, as your head shakes around so much that everything becomes a blurry mess. Perisoft himself (the creator of Force Dynamics) said on IRC that Rifts won't really work with proper motion rigs at this point if you want a realistic experience.
Quote from oleroc :It is also true that in DK2 you can mount the camera on the rig, and therefore all movement of the head will be in relation to the rig and correct.

The DK2 takes data from both the IR and gyro sensors to calculate the tracking. Even if it were purely IR-based, the movement of your head would still be completely unrealistic compared to actually driving a car. The effect would be similar to the Gran Turismo cockpit shake at high speed: you just can't see anything.
Quote from Matrixi :Some contradiction there.

I've used both a 301 and a DK1; they do not work together, as your head shakes around so much that everything becomes a blurry mess. Perisoft himself (the creator of Force Dynamics) said on IRC that Rifts won't really work with proper motion rigs at this point if you want a realistic experience.

The DK2 takes data from both the IR and gyro sensors to calculate the tracking. Even if it were purely IR-based, the movement of your head would still be completely unrealistic compared to actually driving a car. The effect would be similar to the Gran Turismo cockpit shake at high speed: you just can't see anything.

No contradiction; rotation throws it off...

David's (Perisoft's) definition of a proper motion rig is absolutely open for discussion, and his main project at the moment is the 401cr, which does not work with the OR unless you turn off the fourth rotational axis.

As I stated in my post, it's all a matter of motion control.

If the settings for the 301 you tried made your head shake violently, it was misconfigured. When I race on a real track, my head does not shake violently unless I crash.

If you program the 301 correctly, the Oculus works just fine..

I think I should know.....
It's not really a matter of configuration; it's simply how rendering and head tracking work on the Rift.

A simple test that anyone can do and understand: focus on any point from 1 meter away to infinity, then quickly move your head up and down. You can clearly keep track of the point, because your eyes are tracking it and keeping it in focus while your head is moving.

In virtual reality on a motion rig, your head is being yanked around violently and your whole viewpoint is moving, but your eyes cannot track it. You can roughly simulate this in real life by not focusing on anything and quickly moving your head around. Everything you see will be nothing but blur.

VR without eye tracking combined with motion rigs is a poor experience. But if you're happy with it, then carry on.
Quote from PeterN :Personally I'm astonished that DK2 is incompatible with DK1, considering the whole point of the dev kits was to allow developers to get up to speed with programming for the thing.

Well, it's designed so developers can start to make content for VR, which is a much bigger deal than just implementing an SDK. It's also designed to get out to a wider test audience for testing and iteration. A pre-release project is not the time to worry about spending a bunch of time on backward compatibility. I'd rather they make leaps and bounds on the technology and improve the quality of the experience, rather than spend a lot of time or money maintaining support for an outdated alpha test rig.

I do believe they're going to need to put thought into making certain things more backward- and forward-compatible in the future, though. Once it becomes a real product, they really need to figure out how to make sure their version-two product doesn't require patching from all the game developers.
Quote from Matrixi :It's not really a matter of configuration; it's simply how rendering and head tracking work on the Rift.

A simple test that anyone can do and understand: focus on any point from 1 meter away to infinity, then quickly move your head up and down. You can clearly keep track of the point, because your eyes are tracking it and keeping it in focus while your head is moving.

In virtual reality on a motion rig, your head is being yanked around violently and your whole viewpoint is moving, but your eyes cannot track it. You can roughly simulate this in real life by not focusing on anything and quickly moving your head around. Everything you see will be nothing but blur.

VR without eye tracking combined with motion rigs is a poor experience. But if you're happy with it, then carry on.

OK, I'll spell it out for you.. I configure the motion so that your head does not get yanked around violently. If your head behaves that way in a motion rig, it's all wrong...

The overall motion is also dialed down, since you don't want that much motion when you use the OR.

As it is, it's absolutely possible to control most aspects of a motion rig, including eliminating unwanted violent shaking..

You are absolutely welcome to stop by and try it out, by the way, if you happen to be nearby....
Quote from oleroc :OK, I'll spell it out for you.. I configure the motion so that your head does not get yanked around violently. If your head behaves that way in a motion rig, it's all wrong...

The overall motion is also dialed down, since you don't want that much motion when you use the OR.

As it is, it's absolutely possible to control most aspects of a motion rig, including eliminating unwanted violent shaking..

You are absolutely welcome to stop by and try it out, by the way, if you happen to be nearby....

In that case, all you're doing is compromising the motion rig's simulation just to use the Rift. Again, my point, which Perisoft was also making, is that if you want a realistic, true-to-life experience (which is what motion rigs are all about), they do not go well with the Rift, at least until eye tracking is implemented.

My time in the 301, which was fully configured for maximum realism, was comparable to the G-force head movements of an F1 driver. The viewpoint shook around a whole lot. South City with the BF1 was basically undrivable with all the bumps and vibrations that the motion rig was throwing at my body. Street cars are probably a lot more mellow and suitable.

If I ever visit Norway, I'd be happy to be proven wrong, though!
Quote from Scawen :Well there are two ways now, but neither of them is like the old way.

I find both ways quite complicated and confusing, compared with the old system. But I am reserving judgement. Sometimes it takes a little time and sleeping to get my head around things. Let's see if I can get one of them done today...

If I may offer my opinion on the matter, I think that in the long run, having the Oculus SDK do as much as possible would probably be the better route, as it will make the implementation more resilient to SDK changes and updates in the future.

Granted, all my Oculus dev so far has been in Unity, which is basically like cheating. So I'm sure you know what's best, but if you're having a hard time deciding, that's my perspective on the matter!
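For what it's worth, from skimming the C API docs (again, my own dev has been through Unity, so treat this as a rough sketch rather than gospel), the "let the SDK do it" route in 0.4 looks something like this:

// Rough sketch of SDK distortion rendering in the 0.4 SDK, from my reading
// of the docs; check OVR_CAPI.h for the exact signatures and structures.
#include <OVR_CAPI.h>

void RunVr()
{
    ovr_Initialize();
    ovrHmd hmd = ovrHmd_Create(0);

    // Ask for orientation + position tracking
    ovrHmd_ConfigureTracking(hmd,
        ovrTrackingCap_Orientation | ovrTrackingCap_MagYawCorrection |
        ovrTrackingCap_Position, 0);

    // Hand the SDK our render API config (D3D/GL-specific union) and let it
    // own the distortion; it returns per-eye render descriptions.
    ovrFovPort eyeFov[2] = { hmd->DefaultEyeFov[0], hmd->DefaultEyeFov[1] };
    ovrEyeRenderDesc eyeRenderDesc[2];
    ovrRenderAPIConfig cfg = {};   // fill in via ovrD3D11Config / ovrGLConfig etc.
    ovrHmd_ConfigureRendering(hmd, &cfg,
        ovrDistortionCap_Chromatic | ovrDistortionCap_TimeWarp | ovrDistortionCap_Vignette,
        eyeFov, eyeRenderDesc);

    ovrTexture eyeTexture[2] = {}; // per-eye render targets (API-specific unions)
    ovrPosef eyePose[2];
    bool running = true;

    while (running)
    {
        ovrHmd_BeginFrame(hmd, 0);
        for (int i = 0; i < 2; i++)
        {
            ovrEyeType eye = hmd->EyeRenderOrder[i];
            eyePose[eye] = ovrHmd_GetEyePose(hmd, eye);
            // ... render the scene for this eye into eyeTexture[eye] ...
        }
        // The SDK warps, corrects chromatic aberration and presents;
        // no distortion shader of your own.
        ovrHmd_EndFrame(hmd, eyePose, eyeTexture);
    }

    ovrHmd_Destroy(hmd);
    ovr_Shutdown();
}

The devil is obviously in the API-specific config and texture structs, but the idea is that you never touch the distortion yourself.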
Some text I just posted in the LFS thread on Oculus forums:

Quote :Hi, I'm struggling along here with the DK2 support. It's a frustrating process, as the new SDK hides all the info about the Rift. The SDK just tries to give us code, in a failed attempt to make it simpler. But of course, their code cannot possibly just slot into an existing program, so the process is much more difficult than expected. They don't provide distortion constants and dimensions as before. If they did, you'd have LFS already. I have got all the tracking done; I'm now working on the distortion mesh. LFS generated its own one before, but now it must use one provided by the SDK, as the info to build your own is simply not supplied.

One thing that stopped me in my tracks was that there are incorrect images in the Developer Guide. The images look like the ones for the DK1, where the screen was wider, so the images kind of overlap in the middle. In fact, with the DK2, the screen is smaller and there is very little peripheral vision. It's like being in a swimming mask, so I guess this will be one of the main criticisms of the hardware. The images don't agree with what you expect from the values "LeftTan" and "RightTan".

The two images don't overlap in the middle. It seems to me that DK2 screenshots are more like two images with a very similarly shaped outline: the centre of each eye's view is very close to the centre of that half, but slightly towards the outside rather than the inside. In fact, the result is that the right eye can see slightly further left than the left eye can!

I'm working on the "Extended Mode" support. The other mode is completely baffling. Anyway I'm on the case full time, think it's becoming clearer but it's a big change...

Thanks for the update, stay persistent!

One way to make the blinkers/swimming-goggles effect more bearable is apparently to set the eye relief almost all the way to the far end; 3 clicks from the end seems to be a good setting according to a lot of people. Then the FOV is optics-limited rather than screen-limited.

The DK1 felt better the closer the lenses were to your eyes, but the DK2 seems to be the opposite. I suppose it has something to do with the larger lenses.

*Edit: A discussion with Oculus' chie ... Antonov about the 0.4 SDK
Progress...

See how the outline is very different on the DK2.

I've been driving around. Still need to move the view with the position tracking data and there are quite a few loose ends to sort out.

The blue is just a debug background.
Attached images
lfs_00000006.jpg
Wow, scary to see what has to be done. The image is really... twisted.
Anyway, good for you if your efforts are rewarding.
Yesterday I tried something new. I set the eye relief to the middle setting (previously it was set to "as close to the eyes as possible" for maximum FOV). I played with some demos for about half an hour and didn't get sick. Today I tried LFS, and after ~15 minutes I feel fine. I've also turned mipmapping off, which makes everything a bit sharper in the Rift. I was driving slowly, so maybe that helped. After breakfast I will try some full-speed driving and see if the breakfast stays in my stomach.

I've also noticed something while driving the MRT. The car doesn't cast shadows in the interior view. Normally that wouldn't be a problem since you can't look down, but in the Rift it's quite noticeable. It would be nice if you could add an option to turn on car shadows in the first-person view. And 3D mirrors would be nice, especially with the DK2's positional tracking.

I've also noticed that if I pull the Rift's USB plug out and start LFS, it crashes, but it's not something I'm supposed to be doing, so I don't think it's a problem.
Hmm, if only creating in-car shadows was as easy as just "turning it on" :-)

FGED GREDG RDFGDR GSFDG