The topic of getting sick in VR (often called "simulator sickness") is a complex one. The simplified core of the issue is that when your visual perception doesn't match what your body (and especially your inner ear) feels, many people feel sick. This is obviously not exclusive to VR; it can manifest in cars, airplanes, spaceflight, amusement park rides, and more. Note that whatever mechanism in the brain is responsible isn't perfectly tuned to every physical experience. Many people get sick on a roller coaster even though their motion exactly matches what they see; the brain just didn't evolve to be prepared for tight accelerating turns, changes in gravity, and so on, so in those cases the mismatch between reality and the brain's model of reality is enough to cause the problem.

Some people even get sick playing first-person video games without VR. The types of motion in a typical FPS, with lots of dashing around, quick turns, and strafing, can be especially difficult for people who experience these issues.

In VR I think of two main sources of mismatch between what your body feels and the photons entering your retina. Both can be equally important for VR experience designers to consider, and both can be very complicated to get right. System-mismatch is the mismatch caused by the core VR technology. At some basic level, a VR system is measuring your head's current position, velocity, and acceleration and rendering a frame. Ideally that frame will be positioned so that it exactly corresponds to your head position at the moment the photons hit your retina. The other type is content-mismatch: by-design movement of the camera/head, controlled by the experience, that doesn't exactly match the player's physical movement.

For the fundamental technology we are working on at Valve, we spend a lot of effort on system-mismatch. There are a number of techniques to address it: accurate tracking of head position, prediction of future head position, reducing latency in the rendering pipeline, deferring the point at which head position is locked in, adjusting rendering at the last minute, making sure rendering actually makes framerate (since there is no worse added latency than missing a frame entirely), and finally reducing latency between your GPU and the actual photons being emitted (HDMI has a fair amount of latency, which is part of why wireless displays are so difficult).

As I said, most of these are issues for the core VR system. As a content/experience/game developer, there isn't much you can do to change the tracking accuracy of whatever VR hardware you are using. Your main challenge is to never miss framerate. On the HTC Vive we run at 90Hz, so overall there are about 11ms per frame. But there is plenty of overhead between various system tasks, the compositor, GPU scheduling, etc., so realistically you only have about 8ms to render your frame. It is critically important to design your content so that it fits within that budget 100% (or at least 99.9%) of the time. Ideally game engines will help, since it's really hard to tune content to be consistent: most environments are simple most of the time, punctuated by the occasional really complicated moment (when all the explosions happen, say). If your game engine can automatically adjust and reduce render target sizes, AA, texture mips, etc., that can help a ton.
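As a rough illustration of that kind of automatic adjustment, here is a minimal sketch of a governor that scales the eye-buffer render targets based on measured GPU frame time. This is a sketch under assumptions, not Valve's implementation or any particular engine's API: the budget constants come from the numbers above, but how you measure GPU time (timer queries, an engine profiler hook) and how you apply the scale are up to your engine.

```cpp
// Sketch: adaptive render-target scaling to hold a 90Hz frame budget.
// Measuring GPU time (timer queries, engine profiler) is left to the
// caller; this class only decides how to react to the measurement.
#include <algorithm>

class ResolutionGovernor {
public:
    // ~11ms per frame at 90Hz, minus compositor and GPU-scheduling
    // overhead, leaves roughly 8ms of app render time.
    static constexpr float kBudgetMs = 8.0f;
    static constexpr float kMinScale = 0.65f; // quality floor
    static constexpr float kMaxScale = 1.0f;

    // Call once per frame with the previous frame's measured GPU time.
    // Returns the scale to apply to your eye-buffer dimensions.
    float Update(float lastGpuFrameMs) {
        if (lastGpuFrameMs > kBudgetMs) {
            // Over budget: back off hard. A missed frame is the worst
            // latency you can add, so treat this as the emergency case.
            m_scale -= 0.05f;
        } else if (lastGpuFrameMs < 0.85f * kBudgetMs) {
            // Comfortably under budget: creep back up slowly so the
            // scale doesn't oscillate around the threshold.
            m_scale += 0.005f;
        }
        m_scale = std::clamp(m_scale, kMinScale, kMaxScale);
        return m_scale;
    }

private:
    float m_scale = kMaxScale;
};
```

The asymmetry is the important design choice: rendering a slightly softer frame is far better than dropping one, so the governor backs off quickly and recovers slowly.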
Content-mismatch, however, is a more complicated issue. In general I'm a fan of never making your players sick. It's true that some people are less susceptible, but it's not clear to me that they are actually comfortable after spending a bunch of time in a questionable experience. I'd love to do some research where we wire some of them up to heart-rate, breathing, and other monitors and have them play an experience that pushes the boundaries for a full hour. Personally I'm middle of the road: I'm far from immune to getting sick, but not nearly as bad as some. Part of what confuses the debate so far is that so few people have had extended time in VR. When I'm playing an experience that moves the camera, I'm usually fine for 2 or 3 minutes (typical demo length), feel just a little "off" at 5 minutes, and it doesn't get bad until a while after that. But VR is not going to be successful if 80% of people feel even a little uncomfortable after 5 minutes. They might not even realize they feel sick, but they certainly won't enjoy the experience the way they would a well-designed experience that doesn't have these issues.

With all of the above as preface, it's useful to discuss what you can actually do. Most of the demos that Valve has shown at GDC and since then had almost no forced camera movement. We wanted to be especially careful to make sure people had great experiences while we are still learning what works. At this point I don't think we can definitively say that we know what works, but there are a bunch of techniques that look promising.

In general it seems fine to have the player in a slow-moving elevator with good visual references. Cloudhead Games does this well in the [The Gallery](http://www.thegallerygame.com/blog/) demo. Cockpits seem to help, so driving cars or flying airplanes might work when done carefully (with the caveat that many people get sick doing the real thing in these scenarios). The agency of directing the vehicle yourself also seems to help some. I have also seen good results with teleport locomotion: I often build prototypes that let you use one of the controllers to point to where you want to be and press a button to teleport yourself there, covering distances greater than the physical space you have available (a rough sketch of the logic follows at the end of this post).

[RoadToVR has some interesting stuff about experimenting with locomotion at the Austin VR Jam here](http://www.roadtovr.com/vr-devs-experiment-with-vr-locomotion-at-htc-vive-vr-jam-in-austin/).
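Here is the rough shape of those teleport prototypes, as a sketch under loudly-labeled assumptions: `TraceRay` below is a stand-in flat-floor intersection (a real engine would raycast its collision geometry), and the controller and input plumbing are hypothetical. The one design point worth calling out is that teleporting moves the tracked play-space origin rather than animating the camera, so the view never moves in a way the player's head didn't.

```cpp
// Sketch: point-and-teleport locomotion. Poses are assumed to come
// from your VR runtime; the scene query is a stand-in.
struct Vec3 { float x = 0, y = 0, z = 0; };

struct HitResult {
    bool hit = false;
    Vec3 point;
    bool walkable = false; // a surface the player may stand on
};

// Stand-in scene query: intersect the aim ray with a flat floor at
// y = 0. A real engine would raycast its collision geometry.
HitResult TraceRay(const Vec3& origin, const Vec3& dir, float maxRange) {
    if (dir.y >= -0.01f) return {};      // not pointing downward
    float t = -origin.y / dir.y;         // distance to the floor plane
    if (t < 0.0f || t > maxRange) return {};
    return {true, {origin.x + dir.x * t, 0.0f, origin.z + dir.z * t}, true};
}

// Called every frame while the teleport button is held. 'origin' and
// 'aim' are the pointing controller's tracked position and normalized
// forward direction. Returns true and fills 'target' when the aimed
// spot is valid, so you can draw a destination marker there.
bool UpdateTeleportAim(const Vec3& origin, const Vec3& aim, Vec3* target) {
    const float kMaxRangeMeters = 10.0f; // tune for your content
    HitResult hr = TraceRay(origin, aim, kMaxRangeMeters);
    if (!hr.hit || !hr.walkable) return false;
    *target = hr.point;
    return true;
}

// On button release: relocate the play space, not the camera. The HMD
// stays fully slaved to real head motion; we only move the space it is
// tracked within. 'headOffset' is the HMD's current offset from the
// play-space origin; subtracting its horizontal part puts the player's
// head, not the center of the room, at the chosen spot.
void CommitTeleport(Vec3* playSpaceOrigin, const Vec3& headOffset,
                    const Vec3& target) {
    playSpaceOrigin->x = target.x - headOffset.x;
    playSpaceOrigin->y = target.y;       // floor height at the target
    playSpaceOrigin->z = target.z - headOffset.z;
}
```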