At GDC today, a number of VR and AR developers gathered for a casual forum moderated by Chris Pruett, who does developer relations for Oculus VR. What followed was an interesting jam session as creative minds shared their ideas, triumphs, and frustrations with virtual platforms. Pruett noted at the outset that he was not there as a representative of Oculus, and in fact he was not even the session's originally planned moderator.
He began by taking an informal survey of the room. By a show of hands, he estimated that about 70% of the people there were actively developing, though only two raised their hands when he asked whether anyone had been working on VR or AR for five years or more. Most of the room also expected to release their products within eight months. Interestingly, not everyone was in it to make money: a few were after public feedback, which they planned to use to iterate on their work, possibly toward a retail product but not necessarily.
They all agreed that motion sickness was a prevalent problem, and they discussed the ways they were combating it in software (as opposed to relying on the device itself, with its motion-sensing camera, to keep the user's head correctly oriented). There was a consensus around creating a virtual copy of the user's body within the world, but it had to stay synchronized with the user's movement, or else the disorientation and nausea would be even worse than they were without the copy.
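To make the body-copy idea a bit more concrete, here is a minimal C++ sketch of applying the latest tracked head pose to an avatar every frame, with no smoothing that would let the copy lag behind the user's real movement. The HeadPose, Avatar, and readTrackedHeadPose names are hypothetical stand-ins for whatever a real engine or headset SDK provides, not anything described at the session.

    #include <cstdio>

    // Hypothetical minimal types standing in for whatever the engine provides.
    struct Vec3 { float x, y, z; };
    struct Quat { float w, x, y, z; };
    struct HeadPose { Vec3 position; Quat orientation; };

    // Placeholder for the headset's tracking query (assumed, not a real API).
    HeadPose readTrackedHeadPose() {
        return HeadPose{{0.0f, 1.7f, 0.0f}, {1.0f, 0.0f, 0.0f, 0.0f}};
    }

    struct Avatar {
        HeadPose head;
        // Apply the sensor pose directly each frame; any smoothing or delay
        // here makes the virtual body lag the real one, which is exactly the
        // kind of mismatch the developers said makes people sick.
        void syncToTracking(const HeadPose& tracked) { head = tracked; }
    };

    int main() {
        Avatar avatar;
        for (int frame = 0; frame < 3; ++frame) {
            avatar.syncToTracking(readTrackedHeadPose());
            std::printf("frame %d: head height %.2f m\n",
                        frame, avatar.head.position.y);
        }
    }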

Creating a goggle-like frame around the edges of the user's vision also helped, much as a movie camera simulates looking through binoculars. Limiting navigation through the world and instead bringing the content to the user was also beneficial, as was establishing a visual horizon and a virtual floor beneath the user's feet.
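As an illustration of the goggle-frame trick, the sketch below computes a simple vignette falloff from a pixel's normalized screen coordinates. It runs on the CPU here only to keep the example self-contained; in a real title this would live in a fragment shader, and the radius values are arbitrary assumptions rather than anything recommended at the roundtable.

    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    // Vignette falloff for a pixel at normalized coordinates (u, v) in [0, 1].
    // Pixels inside innerRadius are left alone; brightness ramps down to zero
    // by outerRadius, which reads as a soft goggle-like frame in the headset.
    float vignetteFactor(float u, float v,
                         float innerRadius = 0.35f,
                         float outerRadius = 0.50f) {
        const float dx = u - 0.5f;
        const float dy = v - 0.5f;
        const float dist = std::sqrt(dx * dx + dy * dy);
        const float t = (dist - innerRadius) / (outerRadius - innerRadius);
        return 1.0f - std::clamp(t, 0.0f, 1.0f);
    }

    int main() {
        // The centre of the view stays fully visible; the corners are masked off.
        std::printf("centre %.2f, edge %.2f, corner %.2f\n",
                    vignetteFactor(0.5f, 0.5f),
                    vignetteFactor(0.95f, 0.5f),
                    vignetteFactor(1.0f, 1.0f));
    }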
The general consensus was that VR and AR felt like the early days of 3D gaming in the mid-1990s: developers were still learning how the technique works (and how it doesn't) and were implementing hacks to create certain illusions when the hardware couldn't handle the processing requirements of doing it for real. Pruett mentioned a trick he'd figured out with mirrors. Ordinarily, mirrors are a problem in a virtual space because they force the GPU to render a reflected view of the 3D scene on top of everything it was already handling, which can kill performance. His workaround was to lower the refresh rate to 30Hz, which looked fine as long as there wasn't a lot of movement in the scene.
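One plausible reading of that trick is that only the reflection is refreshed at 30Hz while the main view keeps running at full rate. Here is a rough C++ sketch under that assumption (the Texture struct and the render functions are hypothetical placeholders, not any particular engine's API): the mirror's render-to-texture pass runs every other frame of a 60Hz loop, and the cached texture is reused in between.

    #include <cstdio>

    // Hypothetical placeholder for a render target holding the mirror's image.
    struct Texture { int lastUpdatedFrame = -1; };

    // Stand-in for the expensive pass that redraws the scene from the
    // mirrored camera into the texture.
    void renderReflectionToTexture(Texture& target, int frame) {
        target.lastUpdatedFrame = frame;
    }

    // Stand-in for the normal scene render, which samples the mirror texture.
    void renderMainScene(const Texture& mirror, int frame) {
        std::printf("frame %d shows a reflection from frame %d\n",
                    frame, mirror.lastUpdatedFrame);
    }

    int main() {
        Texture mirrorTexture;
        const int mirrorUpdateInterval = 2;  // 60Hz loop -> roughly 30Hz mirror.

        for (int frame = 0; frame < 6; ++frame) {
            // Pay for the reflection pass only every other frame; otherwise the
            // mirror simply shows the slightly stale cached texture.
            if (frame % mirrorUpdateInterval == 0) {
                renderReflectionToTexture(mirrorTexture, frame);
            }
            renderMainScene(mirrorTexture, frame);
        }
    }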
The attendees largely did not mention their names or what they were working on, but conversation did start to flow once people let their guard down a bit. Some hesitation is understandable, since in many cases these people are working on products that aren't ready for the public eye yet, and they may be using clever ideas that they'd prefer to keep to themselves. But the takeaway from this roundtable is that VR and AR could benefit quite a lot from developers sharing their ideas and discoveries with one another. While devices like the Oculus Rift are amazing technology, they'll become historical curiosities without compelling content to drive them forward. In an environment as collaborative as software development, two heads are better than one.
