The online racing simulator
Searching in All forums
(12 results)
aikakana
S2 licensed
Quote from adrianstealth :yes I'll give it a go without & report back.

I'm surprised you're having such a result; I thought as long as everything is in sync with your head movement you'd be fine

remember, whilst in motion my head is moving around (vs the track cam, which is fixed on a wall), but the visuals stay in sync, same as if there were no motion

the only thing I can think of: is there any head wobble effect in LFS which should be off? That would make the visuals add movement that doesn't correspond to your real head

your system should eat LFS; I use no vsync and no fps cap

Thanks, it will be interesting to hear how you feel.

No, the problem isn't with head movements but with the lack of g-forces. When you try without the moving platform, try slowly stopping the car while looking out of the side window. That is the absolute worst for me.
aikakana
S2 licensed
Quote from adrianstealth :hi

it's a custom setup ( professionally welded & powder coated )
-tried to post image but can't seem to do it on these forums

it gives all the movement I feel is needed (whilst being able to fit it into a house)

I feel no nausea at all; I haven't tested without motion. I'm surprised you get nausea whilst sim racing. Make sure your fps is high with no vsync - you need very low latency. Turn down graphics detail / no AA etc. and go for above 150fps

Would you mind testing without motion one day?

My PC runs LFS very well (4.5GHz i7, GTX 770), so that's not the problem. The problem when stopping the car is that I anticipate the deceleration ending and the nudge forwards that it causes. You know how, when you stop a real car, all the passengers nudge forward. When that doesn't happen my eyes go blurry, my head feels dizzy and I get an instant hit of nausea in my stomach. I have thought that a motion platform with even the slightest amount of front & back tilt could fix the issue and make it comfortable. At high speeds I have no problems whatsoever, but the slower I go the more pronounced the effects are, and stopping the car is the most extreme case.

From what I have read, the issue comes from a "fusion" conflict between the vestibular system and the visual system. There's plenty of research done for military sims available online, and the conclusion seems to be just that. Another conclusion was that the less realistic the experience, the less simulation sickness, and vice versa.
aikakana
S2 licensed
Quote from adrianstealth :re. dk2 with motion

I'm using a motion setup with the dk2 with extremely good results
the camera is in a fixed position on a wall

my platform pitches / rear traction loss
+seat motion on top of the platform

home setups such as mine do not make big movements! just enough to feel the onset of g-forces + give an indication of position

the dk2 movement actually adds to the motion effect
-the chair moving slightly to the left, to simulate one's body being moved to the side in relation to the wheel, shows up in the dk2 & gives good additional visual cues


something like the redbull simulator would obviously need a VR track cam mounted to the actual platform

http://youtu.be/rE-Fge3gN9w

Interesting. I have been on the verge of investing in such a setup, but one reason why I haven't is the tracking issue. Which system are you using?

Do you feel that you get less nauseous when using the moving setup? I get a really bad hit of nausea and head dizziness when stopping the car and at slow speeds generally.
aikakana
S2 licensed
Quote from sinbad :I think with the complete removal of external reference points, the motion chairs could very convincingly give you some sense of longitudinal and lateral G-force.

Could the rift (or LFS) subtract the information which is sent to the chair? So if the chair was one of those high motion Force Dynamics type chairs, and under hard braking it was instructed to pitch forwards 20 degrees, would you be able to simultaneously subtract 20 degrees from the pitch information sent by the rift headset?

As far as I know it should be the Oculus SDK that does that, figuring out the platform orientation.

If LFS fixed the head orientation using the platform info, I suspect it would lead to very jittery movement, because the Oculus positional tracking uses the IMU too. When accelerating you lean backwards on such a platform while in-game you should stay still, but the Rift IMU will sense the backward acceleration, think the headset is being moved, and that will show up on screen. The SDK would need to know how the platform is moving so it could take that additional movement into account in the sensor fusion. If the camera sees the headset staying in place (camera on the platform) but the IMU senses movement, that's a conflict, and I have no idea how it would be handled. If the positional tracking were done only with the camera this would be a much easier problem.
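To make that concrete, here's a minimal sketch (hypothetical code, not anything from the Oculus SDK, with frames and sign conventions simplified) of how a platform's own motion data could be subtracted from the headset accelerometer before the position is integrated - roughly the extra input the sensor fusion would need:

```cpp
// Minimal sketch, not Oculus SDK code: why the fusion would need the platform's
// own motion. All names are hypothetical; frames and signs are simplified.
#include <cstdio>

struct Vec3 { float x, y, z; };
Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

// The headset IMU reports the sum of head-vs-cockpit motion, the platform's
// motion and gravity. Integrated as-is, every platform nudge becomes a phantom
// head movement in-game.
Vec3 head_relative_accel(Vec3 headset_accel_world,   // headset accelerometer, world frame
                         Vec3 platform_accel_world,  // from a second IMU on the platform
                         Vec3 gravity)               // e.g. {0, -9.81f, 0}
{
    // What remains is the rider's head moving relative to the cockpit,
    // which is the only motion the game should actually show.
    return headset_accel_world - platform_accel_world - gravity;
}

int main() {
    Vec3 imu      = {0.0f, -9.81f, -2.0f};  // platform pushing the rider, plus gravity
    Vec3 platform = {0.0f,  0.0f,  -2.0f};
    Vec3 g        = {0.0f, -9.81f,  0.0f};
    Vec3 rel = head_relative_accel(imu, platform, g);
    std::printf("head-relative accel: %.2f %.2f %.2f\n", rel.x, rel.y, rel.z);  // all zeros: head did not move
}
```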
aikakana
S2 licensed
Quote from Ped7g :You should mount the tracking camera on the chair itself, that way you will still look forward (from Rift's point of view). Only head movements relative to the chair will be tracked.

There are no good solutions for this yet. The easiest (to my logic at least) would be to have a second IMU (identical to the one in the Rift) attached to the moving platform. Then the Oculus SDK would know how the platform is oriented and could adjust the "down" vector accordingly. The camera could be on the platform or not; it wouldn't matter as long as it's implemented that way.
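A rough sketch of what I mean, purely hypothetical and not from the actual SDK: with a second IMU reporting the platform's orientation, the pose handed to the game could be the head orientation expressed relative to the platform, so the platform's own tilt no longer reads as head movement.

```cpp
// Hypothetical sketch of the "second IMU on the platform" idea, not SDK code.
// Both orientations are assumed to be unit quaternions in the same world frame.
struct Quat { float w, x, y, z; };

Quat conjugate(Quat q) { return {q.w, -q.x, -q.y, -q.z}; }  // inverse of a unit quaternion

Quat mul(Quat a, Quat b) {
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}

// Head pose expressed in the platform's frame: when the platform pitches back
// under acceleration, its rotation is divided out, so the in-game view does not
// tilt up - only real head motion relative to the seat remains.
Quat head_relative_to_platform(Quat head_world, Quat platform_world) {
    return mul(conjugate(platform_world), head_world);
}
```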
aikakana
S2 licensed
Quote from ronin17 :I'm trying to troubleshoot an issue I am having with my DK2. At random times my head position will reset to the driver's lap. I can't find a reliable way to reproduce the issue, and it doesn't seem to depend on any particular event in the game. I can be looking left or right through a turn, or even looking directly ahead, and it will move my position.

I have no idea if it is related, but I noticed that when I touch my monitor the head location jumps about half a meter. It doesn't happen every time and only works for a while; then the monitor needs to "recharge", hehe. Some magnetic/electrical interference might be causing some jumps.
aikakana
S2 licensed
Quote from adrianstealth :I think vsync is needed if your PC doesn't do very high fps

-no vsync here as I have a much better & more fluid experience with it off

( over 300fps )

I agree with this, but only partly. I feel the experience is overall best with VSync, but the jittery head motion with small movements is better with very high FPS, probably due to fewer motion prediction issues. Timewarp to the rescue, please =)
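A toy example of why I suspect the prediction - just my guess at the mechanism, not how the SDK actually works: the tracker extrapolates the last measured pose forward by roughly the render-to-display latency, so any noise in the measured angular velocity gets multiplied by that interval, and shorter frame times mean smaller overshoots.

```cpp
// Toy illustration only: linear extrapolation of yaw over the prediction interval.
#include <cstdio>

float predicted_yaw(float yaw_rad, float yaw_rate_rad_s, float latency_s) {
    return yaw_rad + yaw_rate_rad_s * latency_s;  // simple constant-velocity prediction
}

int main() {
    const float noise_spike = 0.5f;  // a momentary 0.5 rad/s error in measured yaw rate
    // ~300 fps, ~3 ms to predict over, versus ~75 fps with vsync, ~13 ms:
    std::printf("overshoot at  3 ms: %.4f rad\n", predicted_yaw(0.0f, noise_spike, 0.003f));
    std::printf("overshoot at 13 ms: %.4f rad\n", predicted_yaw(0.0f, noise_spike, 0.013f));
}
```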
aikakana
S2 licensed
Quote from Scawen :I think it depends what we mean by "staying in place". My understanding is the warning stays in place relative to the headset, so it would still be visible if you looked sideways (that's what I mean by moving around with your head). But that is not what Oculus themselves recommend, as they suggest things should stay in place in the 3d space, so you can look around them (like LFS menus).

It does stay in place in space, not relative to your head, so you can look around it. I was surprised it was handled that way, because I had read exactly what you are saying, so it might have changed between the SDKs.

Edit: or wait a second, do I remember this all wrong? The background was moving and "stayed in place", which might be why it was comfortable. I need to check this out.

Edit 2: sorry for the confusion, it really is as you describe: stuck to your face and not to the world. It is transparent, and that made it comfortable, in contrast to, say, loading screens that are totally static and make you feel ill. Your implementation of it is much, much better.
Last edited by aikakana.
aikakana
S2 licensed
Quote from Scawen :
Also their own one will swing around with your head, rather than remaining in place. So you have to swivel your eyes to read it. This is against their own code of best practice!

In the desk test demo it does actually stay in place. It might have changed between the two version 4 SDKs; I didn't see the previous one at all.
aikakana
S2 licensed
Quote from Alric :Oh sorry, I took 3D mirrors to mean they have depth & display the world in 3D, not just that what they display is 2D relative to our 3D head position.

I'm not sure if I'm helping here, but the result would be "just" 2D relative to each eye position, just like we have a 2D image per eye in the Rift itself. In other words what you are describing is actually all that is needed.
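To illustrate what "2D relative to each eye position" could mean in practice (made-up example code, not how LFS actually renders its mirrors): the mirror's texture would be drawn once per eye, from the tracked eye position reflected across the mirror plane, and the two flat images then carry the correct stereo offset by themselves.

```cpp
// Made-up sketch of per-eye planar mirrors, not LFS code.
struct Vec3 { float x, y, z; };

float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
Vec3 scale(Vec3 v, float s) { return {v.x*s, v.y*s, v.z*s}; }
Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

// Reflect a point across the plane dot(n, x) = d (n must be unit length).
Vec3 reflect_point(Vec3 p, Vec3 n, float d) {
    return sub(p, scale(n, 2.0f * (dot(n, p) - d)));
}

// Per frame and per eye: reflect the tracked eye position across the mirror
// plane, render the scene from there into the mirror's texture, and show that
// flat texture on the mirror quad. Each eye gets its own slightly different 2D
// image, which is all the stereo depth cue needs.
```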
aikakana
S2 licensed
Has anyone else noticed that the positional tracking does something weird with very small movements? I feel like it could be the prediction, because it feels as if my viewpoint is moved too far for a very short moment. With bigger and longer motions it's perfect, but in the menus, for example, it's very easy to notice and makes me feel a bit ill. Edit: I didn't notice this in the Oculus demo scene with the desk.

Would timewarp perhaps help if the problem is that the prediction "predicts too much"? Timewarp would update the view just before presenting, so it would be much closer to where my head really is.
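For what it's worth, my understanding of timewarp in a nutshell (a yaw-only simplification, not the SDK's actual implementation): just before the frame is displayed, the head pose is sampled one more time and the already-rendered image is shifted by however much the render-time prediction missed.

```cpp
// Yaw-only simplification of timewarp, for illustration only.
float timewarp_shift_pixels(float yaw_rendered_rad,    // pose the frame was drawn with (predicted)
                            float yaw_at_scanout_rad,  // pose sampled just before display
                            float pixels_per_radian)   // rough scale from the eye projection
{
    // The renderer over- or under-predicted by this much; the final distortion
    // pass compensates by sampling the eye texture with an equivalent offset,
    // so the over-prediction is mostly corrected at display time.
    return (yaw_at_scanout_rad - yaw_rendered_rad) * pixels_per_radian;
}
```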

Btw, awesome work Scawen! In every other way your Rift integration is near perfect!
aikakana
S2 licensed
Quote from jasonmatthews :Time warp?

As far as I know it should be in there already, but you need to enable V-sync for it to work.

Now if only the tracking volume were bigger, we could walk around the car and then hop in. Then one version later, when Oculus brings out their hand trackers, we could roll down the window and give the finger to passing drivers
Last edited by aikakana.