I spent the past two days up in lovely San Francisco at the Game Developers Conference at Moscone Center.
After a long drive up Tuesday night to the couches that were waiting for us in San Jose, we got up Wednesday morning and headed into the city.
After a long struggle to find parking, we made our way into the North Hall for registration. The process was painless – nothing but entering my e-mail address into a laptop – and then I walked over to receive my ID card and my Expo Pass.
All that was left was walking across the street to the South Hall and entering the Expo.
What awaited us inside those doors was an amazing assortment of cool demos, interesting games, hundreds of displays and more buzz about VR than ever before.
If you thought that the VR craze would come and go, you’d be very wrong. While commercialization might still be a year or two away, every month there’s another announcement, another competitor entering the ring – and I say the more players the better. For anyone to have a chance at success, there need to be some failures along the way.
Valve and HTC announced the HTC Vive headset on Monday. It uses something they call Lighthouse tracking, which relies on lasers to get highly accurate positional data with latency they believe is low enough to reduce motion sickness on its own.
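To give a feel for the idea, here’s a simplified sketch of how laser-sweep tracking of this kind can turn timing into position – my own illustration of the general principle, not Valve’s actual implementation. A base station flashes a sync pulse, then sweeps a laser across the room at a known rotation rate; the time between the flash and the beam hitting a sensor maps directly to an angle.

```python
import math

SWEEP_HZ = 60.0                      # assumed rotor speed: one sweep per 1/60 s
PERIOD = 1.0 / SWEEP_HZ

def sweep_angle(t_sync: float, t_hit: float) -> float:
    """Angle (radians) of a sensor from sweep start, given pulse timestamps."""
    dt = (t_hit - t_sync) % PERIOD   # time from sync flash to laser hit
    return 2.0 * math.pi * dt / PERIOD

# A sensor hit exactly a quarter period after the sync flash sits at 90 degrees.
angle = sweep_angle(t_sync=0.0, t_hit=PERIOD / 4.0)
print(math.degrees(angle))           # 90.0
```

With one horizontal and one vertical sweep you get a full bearing to each sensor, and bearings from multiple sensors on a rigid headset are enough to solve for its pose – which is why timing accuracy translates so directly into low-latency positional tracking.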
I tried out a demo of something called the Perception Neuron, which let me fly around a virtual world with an Oculus Rift DK2 on my head and a section of the Neuron covering just my left arm – and I can fully understand the concern some people have about motion sickness in these VR environments. The lower I got to the ground and the faster I went, the more parallax I saw and the more confused my brain became, because the motion I was seeing didn’t correspond to the motion my inner ear was sensing.
Wednesday night, Cameron and I made our way to the Vicon/Motion Capture Society mixer where we met a ton of professionals in the motion capture field.
This setup – an omnidirectional treadmill of sorts – is particularly amusing to watch as a bystander and is most likely a ton of fun. You put on slippery shoes and get strapped into a little pad where the floor is just slippery enough for you to move around on, but not so slippery that you fall over. Your pressure against the waist belt is what propels you in-game; your vision and pointing are handled by your head via the Oculus, and your gun by some kind of tracker.
If you haven’t heard of Project Tango from Google’s ATAP labs, here’s a video that will get you up to date.
I got to demo one of their newer Project Tango tablets and walked away very impressed. They had two demos they ran that were pretty eye-opening. Google’s goal with the Tango tech is to enable our devices to get a better understanding of the world around them and the first demo I saw did a great job of just that.
Our guide opened up an app and overlaid a 3D model of a fridge onto the environment. He then instructed us to walk closer to it and tap on the screen to open the fridge’s doors. As the doors opened I instinctively walked up to check out what was inside, and what I found was pretty cool: I could open up drawers inside the fridge and peer closer to see what fruits and vegetables were in them, or open up the freezer and kneel down to look inside.
This new interactive concept was intuitive and came very naturally. The other demo was a game that pitted a helicopter gunship against zombies down below. You moved about much the same way as in the previous demo, but this time your perspective was a god’s-eye view looking down at a monument or house whose grounds were crawling with zombies. I put the target reticle on them and tapped the onscreen button to fire away. Boom – I had just killed my first augmented-reality zombie.
The tough thing about Project Tango is that it isn’t VR and it isn’t AR; it takes the motion from our reality and combines it with a virtual one to give a different ‘perspective’ – something tentatively being called ‘transmogrified reality’.