HTC's doing a little spring cleaning, and it's starting with the wires protruding from its headsets. The company announced today at Computex 2017 that it has partnered with Intel to create a "wireless VR accessory" for the Vive HMD. That accessory will rely on WiGig, the wireless tech Intel created to let you connect basically everything to your PC without plugging in a single cable, and is expected to work with existing Vive HMDs.

This is how HTC described the accessory in its announcement:

The WiGig technology, based on 802.11ad standard, works solely in the interference-free 60GHz band, and enables high throughput and low latency in both directions, from the PC to HMD and from HMD to PC. This means pristine video quality with <7ms latency in any environment, supporting multiple users sharing the same space. All of this results in the seamless wireless VR with the Vive!

The extent to which the wireless VR is "seamless" depends on how far below 7ms latency HTC manages to get. Low latency is crucial to VR: too much can hurt your sense of immersion at best and make you feel sick at worst. The good news is we might not have to wait long to find out how well this accessory works: HTC said it's going to show off a proof of concept when it heads to E3 from June 13-15.

In the meantime, we can take comfort in knowing HTC isn't the only company chasing the wireless VR dream. AMD acquired an Austin startup called Nitero for its wireless XR technology in April, and we recently went hands-on with Sixa's Rivvr wireless VR system. Oh, and Oculus is also working on a self-contained, untethered HMD with Project Santa Cruz. Hopefully we can bid adieu to all these wires sooner rather than later.

Nathaniel Mott contributed to this report.

Source: HTC

Comments

  • boeush - Tuesday, May 30, 2017 - link

    So, here's what I don't get. If these wireless VR headsets are intended to display video content rendered on an external graphics card, then let's see: 90 [FPS - 'meh', I'd rather see closer to 120, but OK] x (1080 x 1200 x 2 [kinda 'meh', IMO, but OK]) x 24 [bpp - no HDR, sorry] =~ 5.6 Gbps

    WiGig is only capable of up to 7 Gbps (under ideal conditions, and only by simultaneously using the 60, 5, and 2.4 GHz bands in a tri-band configuration). So how are they planning on "supporting multiple users sharing the same space"? (And what is 'multiple' - do they seriously mean more than 2?!?)

    Sure, they can try to compress the hell out of the video stream. But intensive decompression workloads would help kill the battery of the headset even faster than the constant-on WiGig streaming, video display to both eyes, and audio to both ears would be doing already...

    Also, what is 'XR'?
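boeush's back-of-the-envelope arithmetic above can be checked with a quick script. This is just the raw pixel rate from the numbers quoted in the comment; it ignores blanking intervals and any protocol overhead:

```python
# Raw (uncompressed) video bandwidth for the Vive's panels, using the
# figures from the comment: 90 Hz, 1080x1200 per eye, two eyes, 24 bpp.
REFRESH_HZ = 90
WIDTH, HEIGHT = 1080, 1200   # per-eye resolution
EYES = 2
BITS_PER_PIXEL = 24          # 8 bits per channel, no HDR

def raw_gbps(hz, w, h, eyes, bpp):
    """Raw pixel bandwidth in gigabits per second (no blanking/overhead)."""
    return hz * w * h * eyes * bpp / 1e9

vive = raw_gbps(REFRESH_HZ, WIDTH, HEIGHT, EYES, BITS_PER_PIXEL)
print(f"Vive uncompressed stream: {vive:.1f} Gbps")  # ~5.6 Gbps
```

That ~5.6 Gbps figure is what makes the comparison against WiGig's ~7 Gbps ceiling so tight for even a single headset.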
  • MrHollow - Wednesday, May 31, 2017 - link

    Well, I think they meant multiple users, each with their own PC and HMD - not one PC driving multiple HMDs.
  • boeush - Wednesday, May 31, 2017 - link

    That wasn't the point: the point is all those users would be sharing the same spectrum in the same room, and within that room there are only so many bits that can be flying around at any given time (any more, and packets would be getting dropped due to signal interference.)
  • yhselp - Wednesday, May 31, 2017 - link

    Isn't there some compression happening along the way? HDMI 2.0 supports 4K@60fps with HDR, which by your calculations would require 28.8 Gbps, and yet HDMI 2.0 only allows for 18 Gbps. Is there something I'm missing?
  • boeush - Wednesday, May 31, 2017 - link

    Not sure how you get to the 28.8 number. 3840 x 2160 (4K) x 60 (Hz) x 30 (bpp, at 10 bpc) =~ 15 Gbps. At 12 bpc, it would still be only ~18 Gbps...
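The two figures in boeush's correction work out as follows. Note this is the raw pixel rate only; HDMI's 18 Gbps link rate also carries blanking intervals, so real-world limits are somewhat tighter than this simple calculation suggests:

```python
# Raw pixel-rate check of the HDMI 2.0 numbers discussed above.
def raw_gbps(w, h, hz, bpp):
    """Raw pixel bandwidth in Gbps, ignoring blanking and encoding overhead."""
    return w * h * hz * bpp / 1e9

print(f"{raw_gbps(3840, 2160, 60, 30):.1f} Gbps")  # 10 bpc -> ~14.9 Gbps
print(f"{raw_gbps(3840, 2160, 60, 36):.1f} Gbps")  # 12 bpc -> ~17.9 Gbps
```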
  • BoyBawang - Wednesday, May 31, 2017 - link

    There is a technique called foveated rendering: the resolution is kept high only in the small area where your eyeballs are focused, and the rest is rendered at low resolution. Big bandwidth & CPU saver. It needs an eyeball tracker to work.
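To illustrate the kind of savings foveated rendering could offer, here is a rough sketch. The foveal window size and periphery scale below are made-up assumptions for illustration, not anything HTC or the commenters specified:

```python
# Hypothetical foveated-rendering pixel budget: a small full-resolution
# foveal window plus a periphery rendered at quarter resolution per axis.
EYE_W, EYE_H = 1080, 1200     # Vive per-eye resolution
FOVEA_W, FOVEA_H = 400, 400   # assumed full-res foveal window
PERIPHERY_SCALE = 0.25        # assumed periphery scale per axis (1/16 the pixels)

full_pixels = EYE_W * EYE_H
fovea_pixels = FOVEA_W * FOVEA_H
periphery_pixels = (full_pixels - fovea_pixels) * PERIPHERY_SCALE ** 2

foveated_total = fovea_pixels + periphery_pixels
print(f"pixel budget vs full render: {foveated_total / full_pixels:.0%}")  # ~18%
```

Under these (invented) assumptions the headset would only need to shade and transmit roughly a fifth of the pixels, which is why the technique is attractive for both GPU load and wireless bandwidth.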
  • boeush - Wednesday, May 31, 2017 - link

    A very rapid eyeball tracker, at that... (saccades are super-fast, and the eyes are constantly jittering on a small angular scale and at a very rapid rate that we aren't consciously aware of, and then there's the whole issue of head-motion on top of it all...) Insofar as foveated rendering goes, my impression is that the point was mostly to reduce load on the GPU -- though I can certainly see how it might be leveraged for video signal compression (still, this kind of a non-uniform compression scheme would be pretty novel, as far as I know.) And I'd still question the ultimate efficiency of any such compression vs. the impact on visual quality (and attendant nausea-inducing lag or compression artifacts...)
  • Luminair - Wednesday, May 31, 2017 - link

    They can compress it however they want, including with AV1 http://aomedia.org/about-us/

    Video compression/decompression is very resource intensive, which is why Intel and Nvidia like to talk about how well they can do it in every new product they try to sell you. Which codec they pick is a matter of cost and market size.

    I assume they'll pick H.264 for the demo because common off-the-shelf parts can decode it. But if this ships in a year, it could use anything.
  • Pat78 - Thursday, June 1, 2017 - link

    XR is Extended Reality. It is an umbrella term encapsulating Augmented Reality (AR), Virtual Reality (VR), Mixed Reality (MR), and everything in between.
    https://www.qualcomm.com/invention/cognitive-techn...
