I’ve been following the Oculus Rift since the Kickstarter a couple of years back, and while I didn’t back the project myself, it has always been an intriguing idea. Of course Oculus ended up being purchased by Facebook for a large chunk of cash, but that’s a different story. Having tested DevKit 1 and DevKit 2, I was really interested to see what changes have been made with the latest prototype. The short answer: ventilation has improved (the lenses fog up less), the display resolution is higher, the screen refreshes faster, tracking is better, and combined with VR Audio the experience is more immersive than ever.

To be clear, this was the first public demonstration of Oculus Crescent Bay, and the first time ever that Oculus has shown VR Audio to anyone outside the company. They held private screenings for the press and other "VIPs", and on the way there we passed the Oculus booth, where a long line of people waited for their turn with the hardware. Jumping that line to go straight into a private screening, I couldn’t help feeling a bit sorry for them. As for Crescent Bay, things have come a long way since the last time I tried an Oculus headset (DevKit 2 at NVIDIA in September, if you’re wondering).

For Crescent Bay, Oculus put together a series of short demos lasting about five minutes in total, by my estimate. All of them used positional audio, so as you turned or leaned in, you got a clear sense of the sounds moving around you. This isn’t new in itself, as HRTFs (Head-Related Transfer Functions) have been used for positional audio for years, but combined with the goggles and stereoscopic 3D it’s very immersive. Oculus has licensed VisiSonics’ 3D audio libraries, though they’re obviously doing a lot of customization to make everything work with the Rift. I had seen some of the demos before, and some were more in line with what you would expect from indie games; a few, however, were clearly designed to impress.
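As a rough illustration of the kind of math behind HRTF-style positional audio, the simplest cue is the interaural time difference (ITD): sound from one side arrives at the far ear slightly later than at the near ear. A common textbook approximation is Woodworth's formula; the sketch below is my own illustration using typical values for head radius and the speed of sound, not anything from Oculus' or VisiSonics' actual implementation:

```python
import math

def itd_seconds(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Approximate interaural time difference via Woodworth's formula.

    azimuth_deg: source angle from straight ahead (0-90 degrees).
    Returns the extra time, in seconds, sound takes to reach the far ear.
    """
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

def itd_in_samples(azimuth_deg, sample_rate=48000):
    """Convert the ITD into a whole-sample delay for the far ear's channel."""
    return round(itd_seconds(azimuth_deg) * sample_rate)
```

A source straight ahead gives zero delay, while one at 90° off-axis comes out around 0.65ms, roughly 31 samples at 48kHz. Real HRTF processing goes much further, convolving the audio with measured head and ear impulse responses so that elevation and front/back cues work too; the delay-only model here is just the most basic positional cue.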

One demo was a cityscape that looked a bit like Gotham, with your viewpoint on a platform high in the air. Looking down and stepping off the edge of the platform definitely gives a sense of vertigo, though sadly the demo didn’t let you plummet toward the ground. (And when you look down and can’t see your feet or any other representation of your persona, it does pull you out of the experience a bit.) Another sequence has a T-Rex come stomping around a corner, similar to a scene from Jurassic Park; I was admiring the level of detail when the dinosaur put its face right next to mine, opened its mouth, and roared. The little bits of spittle flying through the air are a nice touch. Finally, there’s a slow-motion on-rails sequence where your view moves forward toward a large alien robot with bullets, missiles, and even cars flying through the air – NVIDIA called this the “car flip demo” back in September. This was one demo where I definitely noticed the increase in visual fidelity thanks to the higher resolution display and the VR Audio.

In terms of hardware, Oculus wouldn’t provide many specifics on the display, but all indications are that they’re using a 2560x1440 OLED panel like the one in the Samsung Galaxy Note 4. While they wouldn’t confirm the resolution, they did walk us through the changes made since DevKit 2. DK1 was the starting point, with a 1280x720 60Hz LCD; it looked okay, but pixelation was very visible and there was some ghosting between images. For DK2, Oculus switched to a 1920x1080 OLED display driven at 75Hz. They also use low persistence, where the image is shown on the OLEDs for 2ms and the screen is then blacked out; this works better with our eyes and largely avoids ghosting, though it was still present at times with DK2. Crescent Bay increases the refresh rate to 90Hz, again with 2ms of illumination followed by a blacked-out screen, and combined with the higher resolution this improves the visuals even more. This was the first time I didn’t notice any ghosting on an Oculus headset.
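For some quick back-of-the-envelope math on those persistence numbers (my own arithmetic, not figures from Oculus): at 90Hz each frame lasts about 11.1ms, so a 2ms pulse means the panel is actually lit for less than a fifth of every frame.

```python
def persistence_duty_cycle(refresh_hz, lit_ms):
    """Fraction of each frame during which the panel is actually lit."""
    frame_ms = 1000.0 / refresh_hz
    return lit_ms / frame_ms

# DK2 ran at 75Hz and Crescent Bay at 90Hz, both with a ~2ms pulse.
for name, hz in (("DK2", 75), ("Crescent Bay", 90)):
    frame_ms = 1000.0 / hz
    print(f"{name}: {frame_ms:.1f}ms frames, lit {persistence_duty_cycle(hz, 2):.0%} of the time")
```

That works out to 15% at 75Hz and 18% at 90Hz. Note that the shorter the frame at a given pulse width, the higher the duty cycle, which is one reason higher refresh rates also help with perceived brightness on low-persistence displays.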

One thing Oculus wouldn’t comment on is a release date. The hardware at this point could probably ship and people would be really impressed, but there’s still a lot of work to be done on interacting with the environment and on the user interface. I wouldn’t be surprised if Crescent Bay gets released to developers as DevKit 3 later this year, but beyond some cool tech demos this isn’t really something end users would want or need just yet. It could easily be a couple more years before a public release, and by then we might see 4K or even 8K displays in the goggles. More important, however, is that we’ll need compelling games and other software that people can actually use, and that will take time more than anything.

I should also note that I was able to try a few other VR headsets at CES. The first was from SoftKinetic, who mounted a forward-facing 3D camera on the Oculus DK2 to let you interact with the environment using your hands. The demo involved reaching into the space in front of you to “grab” boxes and stack them up, after which you could whack them around and knock them over – all with your virtual hands floating in the air. This helped place you in the environment, but it’s still early in development. The second, at the Razer booth, was similar in some ways in that it also used a forward-facing camera mounted on the goggles. You held your hands out in front of you, fire and ice would appear in your left and right hands, and you could throw them at flaming or freezing floating skulls to “kill” them. It was a game of sorts, and the goggles use different software and hardware than the Oculus Rift, but the demo, at least for me, was a bit raw – most of the time my hands wouldn’t actually appear in front of me. Oculus also had Samsung's Gear VR (powered by Oculus) available, but the software being run wasn't at the same level as the Crescent Bay demo, and the hardware seemed more like a cross between DK1 and DK2.

There’s definitely a lot of interesting work being done in VR these days, and compared to what I saw back in the 90s, what we have now is truly impressive. Large polygons have given way to impressively realistic textures and models, and positional tracking and latency are very good as well. It’s not perfect yet, but we’re getting there. It will be interesting to see who manages to release a consumer product first and what software we’ll end up using, and I’m looking forward to seeing more over the coming years.

Source: Oculus

Comments

  • mkozakewich - Wednesday, January 14, 2015 - link

    Ha, I was thinking the same thing this morning. Well-designed games will already render things in lower detail when they're further away, so it would make sense to do the same thing when they're further away from the fovea.
  • JohnnyBoBells - Friday, January 16, 2015 - link

    I think you're missing a crucial point. While our eyes only see high resolution in a small area, we can still distinguish changes in our periphery rather easily. You can do your own experiments at home if you have the proper hardware and know how to code a bit, but what you suggest would easily be perceptible. Our eyes are simply too fast and can recognize such changes. You're asking a sensor to track the eye with such precision, but also with such speed that it can then send that data to the computer for processing, process it, and then change the output before the next frame is drawn. For this technique to work, you need to do it extremely fast (much faster than the refresh rate of the display device) for it to appear normal, but then if you have to do it that much faster, you're asking your hardware to do all that work that I mentioned in the previous sentence that much faster.

    High res graphics with VR were/are never meant to reach all users of gaming. This is one of those areas where you have to pay to play. It's not cheap, nor will it ever be. Current hardware is always playing catch up to the requirements of games at the best settings, and that's for monitor displays. VR requires so much more processing power than a monitor. I'll say it again. Playing the newest and visually stunning games on VR is not, nor will it be, for everyone. It inherently will always require expensive hardware.

    The sad thing is, many games that are a few years old play wonderfully on the Rift and other VR devices with current hardware. Unfortunately, most people don't want to play "old" games. They want the new games. If people were willing to play older titles, I think they'd have fantastic experiences. It's amazing, even at 75Hz, what a game that can hold a constant 75fps, even an older one (2, 3, 4 years old), looks like in VR. While you always know you're in VR, it is so easy to get lost in the environment of the game world when the experience is absolutely fluid.
  • 2late2die - Tuesday, January 13, 2015 - link

    "on the way there we passed by the Oculus booth that had a line of people waiting to experience DevKit 2. After Crescent Bay, I can’t help but feel a bit sorry for them"

    Hmmm, maybe it was on a different day, but on Tuesday when I was checking out Oculus the line was to check out Crescent Bay.
    Now, I can't comment on exact specs but I was blown away by the demo. I even had a very cool moment during that last demo (the slow-mo on-rails sequence). The sequence had bullets flying around and some would pass through "you". Seconds into the demo one such bullet was headed towards me, and just as it hit my imaginary body I felt a buzz/vibration - and for a moment I freaked out, thinking - how the heck did they just integrate haptic feedback into this thing!?! Then a second later I realized that it was my friend texting me and the phone in my pocket vibrating. :)
    But it was such an awesome coincidence and with me so immersed into the demo the vibration really did feel for a moment like it was coming from the area of my torso where the bullet was passing through, not my pants pocket.

    The biggest thing I would love to see with Oculus is something that would track my hands and put them into the VR environment. I think with legs there would be too many challenges, but hands are totally doable and that would add soooo much to the feeling of immersiveness.
  • mkozakewich - Wednesday, January 14, 2015 - link

    Ooh, wire up a Kinect! They can be used synergistically!
  • sorten - Tuesday, January 13, 2015 - link

    Going from a kickstarter charity project to millionaires and facebook employees probably hasn't done much to add a sense of urgency for Oculus developers, but I can't recall the last product that took so long to go from development prototype to consumer product. I guess time will tell if the Rift ever shows up on store shelves.
  • BrooksT - Tuesday, January 13, 2015 - link

    Pretty sure Facebook management and shareholders will provide that sense of urgency. It's not a one-man show, and for all of Facebook's problems, it is professionally managed.

    Lots of products take more than two years to go from prototype to commercial release. Off the top of my head: iPhone, Xbox One / PS4, Google's cars + Glass + other stuff. There are lots more.
  • SleepyFE - Tuesday, January 13, 2015 - link

    Really? Cars? Things that can kill you when they go awry are in the same category as the XBONE and PS4?
    p.s.: I didn't put iPhones in there, because "they explode", and Glass is social suicide, so these two may be in the same category as cars.
  • HammerStrike - Wednesday, January 14, 2015 - link

    That is a silly statement - even a "refresh" of an existing product line, such as an annual smartphone update, likely has an 18-24 month lead time, if not longer. I would suspect Apple and Samsung already have teams working on the iPhone 8 and Note 6. And those are "established" products where the updates are iterative (not to mention the resources of two of the largest tech companies in the world). This is a new product with no existing manufacturing or technology base to draw on - it has to build most of the tech it needs from scratch.

    A better question would be to name a new technology that went from hypothesis to consumer availability in two years or less. I can't think of any.
  • SeannyB - Tuesday, January 13, 2015 - link

    There's always the constant commentary of the consumer Rift being "two years away", but I played Elite: Dangerous with the Rift DK2 (and a HOTAS controller) for maybe 10 hours, and for that particular game, which closely aligns with the "seated VR experience" that Oculus likes to talk about, I feel like all the pieces are nearly there. Improve the pixel resolution, sort out the driver situation ("direct mode" is still wonk), and improve the weight/ergonomics and then it's a totally marketable consumer product for a category of PC gaming. I hope Oculus doesn't get caught up in any scope creep over the consumer Rift needing to be perfect for every conceivable VR experience.

    (It's not totally ideal for first-person action games, but I have stomached hours of Minecraft on it and it's pretty dang cool reshaping its blocky wilderness in realistically scaled VR.)
  • mkozakewich - Wednesday, January 14, 2015 - link

    I've done hours of Minecraft on DK1, even! I think they should release as soon as possible (once software is all fixed up and games have been updated), and then save all the other features for later models.
