It'll be nice to have more high end non-Meta options. Meta's headsets are a pain to use for PCVR. Third party software like Virtual Desktop is basically mandatory for Meta and PCVR.
(For the curious: I maintain a website about setting up a particular flight sim for VR, and it documents all the Meta-specific tweaks and caveats I've found: https://www.8492sqdn.net/guides/dcs/performance/)
I'll admit that I'm not up to date with all the recent developments, nor do I play DCS, but I've never felt like Virtual Desktop was required. I just hook it up with a cable, open the Oculus app, and then I can use whatever runtime or software I want. That's the way it has always worked for me.
It depends on your network though. In my case the image quality was good, but going to the link cable was a substantial improvement in quality and latency.
I’ve not been using Virtual Desktop but regardless, yes, PCVR on Meta headsets is more of a pain than it needs to be. At best it’s a second class citizen to the onboard experience, which as a PCVR user I have zero interest in (there’s no good reason to invest in a library that’s locked to Facebook, and onboard capabilities are uninspiring compared to those of a PC). It seems like they break Link functionality every other software update, too.
BSB2 is definitely on my radar as a replacement for a Quest 2. No inside-out tracking is a bit of a letdown (I don’t relish the thought of setting up lighthouses) but a dedicated no-nonsense PCVR experience combined with the weight and bulk savings would be worth it.
1. Meta's Link software uses highly suboptimal video streaming settings. Makes a big difference in flight sims where you need to see fine detail on instruments and distant aircraft.
2. With Virtual Desktop, I can remove my headset during a long flight to grab a snack or use the restroom, then put the headset back on and resume. With Meta's software, taking the headset off requires fully restarting both the sim and headset.
You redirected me to this comment from my other one up the thread. But for #1, your blog post mentions that going with the USB cable option provides the best quality, which is what I use. Maybe a better wording is that VD is preferred for people who want to use it wirelessly.
For #2... that sounds like some kind of a software issue that only applies to DCS? Or maybe some obscure issue with the headset software? When I take off the headset in its normal link mode, it will usually pause whatever's going on PC-side, but I can just press the power button and keep it running if that's required. Never had an issue with this, let alone something that requires restarting the whole headset.
Is VD actually better than USB link? If that's the case, I might look into buying it. I thought that the post reaffirmed this, but I may be wrong.
I was under the impression that the inconvenience of using a USB cable is compensated by the higher throughput/lower latency of using a direct cable connection vs. a theoretically more limited wireless connection. Does VD simply have a better compression algorithm than Link, or can it actually push through more data?
I'm just pointing out different options that exist. Virtual Desktop is WiFi only but ALVR can do either.
If you're not capped by bandwidth, because you can hardware-encode to H.265 or AV1, it shouldn't make a difference (beyond the bugs and software quirks in both programs).
It offers better compression algorithms, and importantly exposes settings for you to fine tune for your network and GPU. If you have good enough network equipment you may be able to push more data over wireless than wired (most motherboards are bottlenecked by the USB controller bandwidth).
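For a sense of scale on that bandwidth question, here's a rough back-of-envelope sketch; every number in it is an assumption for illustration, not a measured figure for Link or Virtual Desktop:

    # Back-of-envelope: once the stream is hardware-encoded, the transport
    # (USB link vs. good Wi-Fi) usually has headroom; the encoder settings
    # are what you end up tuning. All figures below are assumptions.

    def raw_bitrate_gbps(width, height, fps, bits_per_pixel=24):
        """Uncompressed bandwidth for both eyes combined."""
        return width * height * fps * bits_per_pixel / 1e9

    raw = raw_bitrate_gbps(2 * 2064, 2208, 90)   # assumed per-eye 2064x2208 @ 90 Hz

    encoded_mbps = {"H.264": 400, "H.265": 200, "AV1": 150}   # typical user settings
    links = {"USB 3.x Link": 2000, "Wi-Fi 6, good AP": 1200}  # nominal, pre-overhead

    print(f"raw: {raw:.1f} Gbit/s -> must be compressed on any transport")
    for codec, mbps in encoded_mbps.items():
        for link, cap in links.items():
            print(f"{codec:5s} at {mbps} Mbps over {link}: "
                  f"{'fits' if mbps < cap else 'does not fit'}")

Which matches the comments above: once you have a decent router or USB controller, the codec and its settings, not the cable, are usually what decide image quality.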
It sounds great, but I'm still not sure what they'll have to do to make it worth an upgrade for people who already use a Steam Deck and an HMD. I already use the Steam Deck like this often, with an HDMI cable and an adapter connecting it to a Quest 3 for a giant display. I can also run all the Quest window management on the side without taking resources from the Deck, and if I feel like continuing on the Deck alone, I can unplug the cable and keep using it on the small screen. It's a pretty nice setup.
I'm wondering if it would be worth just getting whatever adapter they come up with and the next gen Steam Deck to use the same way rather than investing in Deckard, but I'm interested in seeing their case!
I love that eye tracking is built in; this bodes well for my VR hypothesis:
The killer app for the VR platform is eye contact.
In my opinion Zoom should be investing heavily in this, and Zoom’s market is the one newcomers stand to gain.
The giant feature that Real Life™ has and teleconferencing lacks is the subtle baton passing, engagement reading, and tone modulation that is enabled by eye contact. A square of video just isn’t the same.
My corollary hypothesis for why this hasn’t been done yet is that the Mark Zuckerbergs and John Carmacks of the world are on the autism spectrum, they get by without this information, and therefore don’t realize how much the rest of us rely on it.
I'm a UX Engineer. My phone was ringing off the hook to build VR meeting software in the early 2010s, and I couldn't understand the appeal of either seeing people wearing computers on their faces, or in having conferences with cartoon avatars.
I do. During the pandemic I experimented with some colleagues with VR meeting software like Spatial and Arthur. I thought they were great. These days there's also Viverse which is also impressive. And then there's Microsoft Mesh which we get for free but which is ridiculously behind all the others. It's cool but there's nothing you can actually do in it. You can't pull in a presentation, movie clip, 3D object, collaborate on a whiteboard etc. You can just watch a teams meeting and play some games and roast marshmallows.
For a regular "death by powerpoint" meeting where 95% of participants are just being bored by a single presenter, no, not for those. But for brainstorming workshops, yes definitely.
The avatars are cartoony but for me that doesn't matter. They represent the person for me after a small adjustment period.
It gave me a much stronger feeling of being together. And it's much easier to break out into little groups, much easier than those 'breakout rooms' in Teams, which are pretty inflexible. Here you can just walk over to another group, like in real life. It really felt like being there with them, and after taking the headset off I had this feeling of surprise at being at home.
It was also great to be able to import 3D models of our products and discuss them.
> In my opinion Zoom should be investing heavily in this, and Zoom’s market is the one newcomers stand to gain.
I'm genuinely unsure if you're saying all this with a wink: we're into year 10 of VR Neue, it's been a tarpit for all who got in, and it sounds like an absolutely horrible idea for the video call company to dump money into VR for...eye contact?
Note the headset that supports this is $1200. And that's par for the course for eye tracking AFAIK.
I love VR and, against my rational side, still invest in a premium headset every year or two.
Price, historical market, corporate finances, and responsible stewardship aside, I absolutely cannot wrap my head around the idea that VR eye contact with an uncanny valley head replica and/or cartoon improves eye contact over...just seeing your eyes...so much that it's a killer app.
(my poor unfortunate Vision Pro and its rapid drop-off in sales seem to indicate this very strongly as well)
Many people don't want to strap a device to their face for work meetings; the fidelity loss is acceptable to them.
I think the killer app is smart glasses for AR (non-passthrough), and maybe virtual monitors with Immersed + Visor (when that ships), though I still think the odds of that are low.
Devices will not be mainstream until they reach a sufficiently small form factor
That's basically what the Apple Vision Pro is designed to be, in terms of UX and software. It just turns out we're still a long way away from the hardware being small and light enough to make that work.
I think it's more what the HoloLens 2 was: non-passthrough, so not a VR device like the AVP, something where I still see the real world with my own eyes and virtual objects are overlaid.
Companies ditched that approach because it sucks. It needs all the same hardware as just sticking the screens in front of your eyes, except actually it's even bulkier and heavier because of the stuff needed for the combined optics to work, and for obvious reasons it makes a bunch of actual AR/XR functionality impossible, like anything involving dark colors in a lit room.
Do the adjustable lenses work if you have an astigmatism?
The biggest concern I have about VR, especially for work, is that it forces you to spend too much time looking at a screen that is very close to your eyes. This is known to cause myopia and digital eye strain.
Do any VR headsets attempt to address this problem? Can a headset force your eyes to change focal distance, either using the display or, more likely, a physical lens? Ideally the headset would slowly but consistently force your eyes to change focal distance. Is that something that eye tracking would enable?
Also, does eye tracking work properly if your eyes are slightly misaligned?
As far as your eyes are concerned the screen isn't actually that close, the optics in VR headsets are arranged such that the perceived focal distance is always about 2 meters away. That can have the odd effect of "fixing" nearsightedness in VR because no matter how far away something is in virtual space, your eyes only have to focus at ~2m.
True. I think there have been attempts to make headsets with dynamic focal distance, but to date none of them have been commercialized even at the high end.
VR headsets use lenses which focus the image at a long distance. It is like standing outside and looking at a distant hill, not at all like holding a phone up to your eyes.
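To make the "the screen isn't actually that close" point concrete, here's a thin-lens sketch; the focal length and panel distance are made-up but representative numbers, not any headset's actual optics:

    # Thin lens: a panel placed just inside the lens's focal length produces a
    # virtual image much farther away, which is what your eyes focus on.
    # Convention: 1/d_image = 1/f - 1/d_panel; a negative result means a
    # virtual image on the panel's side of the lens.

    f_mm = 40.0        # assumed lens focal length
    d_panel_mm = 39.0  # assumed panel-to-lens distance, just inside f

    d_image_mm = 1.0 / (1.0 / f_mm - 1.0 / d_panel_mm)
    print(f"virtual image at ~{abs(d_image_mm) / 1000:.2f} m")   # ~1.6 m

So your eyes accommodate to a point a meter or two away regardless of how near the panel physically sits, which is the ~2 m figure mentioned above.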
It's a fantastic-looking piece of hardware, but that price is hard to swallow given that PC VR has been on life support for years and there aren't really any signs of it making a comeback. The content still being made for VR almost exclusively targets Quest standalone as the lead (or only) platform, since that's where >90% of the money is. Nobody aside from Valve can afford to make a "too big for Quest" title like HL:Alyx.
I think it's really worth keeping in mind that this is an ultra-enthusiast product: they're specifically targeting the niche of people who spend a huge amount of time in VR and probably in only a handful of the same apps every time (piloting/racing sims, VRChat, etc). The value prop here is effectively "this is a quality- and comfort-increasing upgrade to the thing you already spend all your free time doing".
Yeah, there's a market of people who are willing to spend hundreds or thousands of dollars for marginal improvements to VR. There are communities who spend about as much time in VRChat as they do in meatspace. And plenty of older folks who are into motorsport and aviation where the cost of a headset is trivial compared to running costs of a sports car or airplane.
Beat Saber is another big title for VR enthusiasts. For the past couple of years I’ve been using BS custom maps to add some consistently fun cardio to my days and have logged almost as many hours as I spent on WoW back in my high school and uni days.
Beat Saber runs perfectly fine on a standalone Meta Quest, so it is exactly not aimed at the "VR enthusiasts" who would invest a lot of money into PC VR devices like Bigscreen Beyond 2.
It does, but modding the Quest version is notoriously more fussy than modding the PC version is, and the heft of the Quest can get annoying when playing custom maps (most of which are vastly more physically demanding than the stock maps).
Not some of the effects-heavy custom Beat Saber maps. Also, Vivify hasn't been ported to Quest Beat Saber AFAIK, so they're still missing out on all the crazy new maps with custom Unity assets and environments.
Maybe, but I think it's more of an indictment of game studios' collective attitude toward VR and of still-underdeveloped hardware (IMO, we're only now starting to approach VR headsets' "final form" with units like the Bigscreen Beyond and MeganeX).
Way, way too many game studios approached VR the same way they might a new graphics feature like raytracing or HDR. "Bolt it on and they'll come", basically, which obviously didn't pan out. Unsurprisingly, the titles that took flight were those where VR was integral to the experience and added lasting novelty; "AAA game but with VR" is just not that interesting (unless it's a free community-developed mod, maybe).
Of course, it’s much more difficult and risky to develop VR-first games like that. With studios being more profit-driven and risk-averse than ever, that’s a recipe for vanishingly few new flagship VR games being developed. It’s a safer bet to develop yet another faceless loot box driven Fortnite/Counterstrike/etc clone or gacha game.
Or an indication that VR killer apps exist strictly inside NSFW domains (but that also implies those elaborate driving/piloting simulator rigs are somehow NSFW too).
Bigscreen as a company isn't really looking to expand the VR market, it's just making an enthusiast-grade product aimed at people who know exactly what they want. Quest focuses on being a smartphone-like device with an integrated ecosystem, while Beyond is more of a PC peripheral for using Bigscreen the app, video content, social VR and some simulators. The audience for these things is small, but pretty dedicated, so it makes sense to offer a high-end solution targeting just them.
The VRChat userbase and the Flight/Racing Sim userbases are both thriving and very, very, *very* interested in continuing to have very good VR options; and both of those userbases have the money to keep companies like BSB and Vive alive.
I feel like some parts of the HN crowd are oblivious to the advancements and continued, impressive work that happens in the VR Space, just because a VR game isn't (for the time being) going to sell a million copies, but it will very comfortably sell to a 100k userbase that want to spend that money.
Valve had an early lead with the Index and has plenty of titles in its store that were designed for the Index. Then Oculus came along with the Quest and many of the titles were ported over there.
People who prefer Valve to Facebook, or who want a device-agnostic software library, might prefer buying from Steam. However, many of these games have been basically abandoned on Steam. Even though Quest ports exist, the versions on Steam may have controls that don't work with Quest controllers over Steam Link.
Bigscreen made their name as a VR video-viewing solution before pivoting to PCVR hardware. I'm surprised they haven't shown off any simple video solution with the Beyond 2, especially considering that's the main use case for the Apple Vision Pro.
That said, eye tracking and IPD adjustment are a huge upgrade to the Beyond 2 for their niche.
> If Valve doesn't announce a new HMD by the summer, I might get this. The Index is getting a bit too blurry and heavy for long sessions.
Not gonna happen. VR is a pure software play for them at this point, with what remains of the hardware having standardized on SteamVR. Index was nothing more than an experiment to help with SDK adoption.
Same. The Index is way too heavy for long sessions and the screen door effect is quite noticeable. I bought it in 2020 and haven't really touched it after playing some Half Life Alyx and Beat Saber. A headset that's only 107 g would be really amazing.
Super interested in this when the eye tracking is available but need some reviews for that first. For sit-down VR the price isn't really an issue for this, given how the GPU market is now anyway. It is a niche product for sure.
Foveated rendering with decent eye tracking could help us get out of the combo of high resolution / low framerates for clarity vs low resolution / high framerates for comfort.
Also, Valve seem to be ready to say something, but it feels like it'll probably be (sensibly) a Steam Deck strapped to your head as a stand-alone solution, and that's not the same market this PC VR set-up is aimed at. Maybe if Valve take the 'store subsidy' hit and price it well, and you can run a cable to it anyway.
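To put a rough number on the foveated-rendering point above, here's a toy estimate of the shading savings; the region size and peripheral shading rate are arbitrary assumptions, not any vendor's figures:

    # Toy estimate of fragment-shading work with eye-tracked foveated rendering:
    # full resolution only inside a small foveal window, coarse shading elsewhere.

    full_w, full_h = 3000, 3000      # assumed per-eye render target
    fovea_fraction = 0.2             # fovea window covers ~20% of each axis
    periphery_rate = 1 / 16          # e.g. 4x4 coarse shading outside the fovea

    fovea_px = (full_w * fovea_fraction) * (full_h * fovea_fraction)
    periphery_px = full_w * full_h - fovea_px
    shaded = fovea_px + periphery_px * periphery_rate

    print(f"shaded work: {shaded / (full_w * full_h):.0%} of naive full-res")  # ~10%

That order-of-magnitude saving is why decent eye tracking could let a headset chase resolution and framerate at the same time instead of trading one for the other.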
The bit that's not obvious is how the low weight and short leverage of the headset affect lag.
When you are wearing a heavy headset that extends far from your face, it's not just the rendering latency and screen latency that create the disconnect between your head movements and what you see. The headset physically lags behind your head motion because it has inertia. The total lag is the sum of the digital and physical lag, so improving the frame rate can only get you closer to the physical-lag floor.
And, that's on top of the practically-instant pixel response of OLED vs LCD.
All that is to say that there are physical explanations for why 90 and even 75 Hz is better in practice than people would reasonably expect on the BSB. I can confirm first-hand. And, so have many reviewers.
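A tiny numeric illustration of the "total lag is a sum" argument; the physical and pipeline latencies below are assumed values, not measurements of the BSB or any other headset:

    # Higher refresh rates shrink the digital share of motion-to-photon lag,
    # but the total can never drop below the physical (inertia) share.

    physical_lag_ms = 5.0    # assumed mechanical lag of a heavy, front-loaded headset
    pipeline_ms = 10.0       # assumed tracking + render + scanout overhead

    for hz in (75, 90, 120, 144):
        frame_ms = 1000.0 / hz
        print(f"{hz:>3} Hz: frame {frame_ms:4.1f} ms, "
              f"total ~{frame_ms + pipeline_ms + physical_lag_ms:4.1f} ms")

Under those assumptions, reducing the physical lag is worth roughly as much as a whole refresh-rate tier.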
Can anyone recommend what the best VR headset is if all I want is the appearance of some giant monitor floating in front of me? I don't think I need any kind of specialized controllers or head tracking or anything like that.
I'd just like to be able to pretend I have my monitor with me when I'm on a plane or space constrained in some way, and I'm happy enough to use keyboard and mouse without seeing them as my input options.
To be honest, I don't think the best option for this is a VR headset; it's a pair of VR glasses. I use the Viture One glasses on airplanes, and the best part is they mostly just look normal on the face compared to something like an AVP or Quest. I think the new Xreal One Pro glasses have a better FoV, but probably not enough for me to upgrade at this point. They can do "spatial tracking", where the display stays anchored in the same spot like a floating monitor as you move your head, or "static tracking", where the display is always fully in your FoV regardless of where you move your head. Both options plug in via USB-C, and you can use them with anything that can do DisplayPort over USB-C (Steam Deck, MacBook, Android phones, iPhones with an adapter, etc.).
All modern VR headsets are going to have head tracking - even for something as simple as outputting a monitor signal, you need to have it occupy a point in 3D space to avoid motion sickness and to let the user naturally look at it up close, etc.
If you're going for no expense spared, the Apple Vision Pro is probably the best device that does exactly this. The Beyond 2 from this video is also good, but it only works in conjunction with a PC, it's not a computing device. Otherwise, there's the Quest 3.
Let's imagine you aspire to 1080p. AR glasses can more or less provide that as a 4 m screen at 2 m distance, with pov angles similar to a laptop screen. With head tracking, that can be a 1080p portal into a larger workspace. For 1:1 realistic motion, notice how far you have to move your head to point your nose at imaginary laptop screens above or to the side of your real one. Unrealistic motion[1] can reduce that, but I don't know if it's available off-the-shelf.
VR headsets trade pixel density for FOV. An Apple Vision Pro has a PPD similar to AR glasses in the center, but that's ~halved by the time you reach the edge of a 4 m 1080p screen at ~30 deg from center[2]. That blur makes eye motion across the screen less available. Try using a laptop screen while keeping your nose pointed at your focus. Head motion gives you greater apparent resolution by temporal supersampling - someone else will have to weigh in with Vision Pro experience.
Giant floating monitors may have been technically possible in non-glasses non-VR HMDs for a few years now. We're mostly limited by pixels-per-degree, and that by panel resolution, rather than optics. There have repeatedly been 2x or more higher res panels available in the sizes used in HMDs. But the HMD market is VR games, which wants large FOV for immersion, is cost sensitive, has optics limitations, and needs higher frame rates, which with GPU and bandwidth limitations (and eye tracking unavailable), makes resolution a "you couldn't use it anyway". One might put 4k panels in a DIY-ish non-VR box-like HMD... anyone know of existing work? A kickstarter? Not enough programmers on transit willing to wear big crude boxes perhaps.
[1] https://x.com/mncharity/status/1225091755667853318#m [2] https://kguttag.com/2023/08/09/apple-vision-pro-part-5b-more... (but... caveat)
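For anyone who wants to plug their own numbers into the pixels-per-degree argument above, the arithmetic is roughly this (the panel width, FOV, and virtual-screen geometry are placeholders, not a specific headset's specs):

    import math

    # Crude average pixels-per-degree across the FOV. Real lenses concentrate
    # more pixels near the center, so treat this as a ballpark.
    panel_px_horizontal = 2160      # assumed horizontal pixels per eye
    fov_deg_horizontal = 100        # assumed horizontal FOV
    ppd = panel_px_horizontal / fov_deg_horizontal

    # Angle subtended by a flat virtual screen (assumed 1.8 m wide at 2 m).
    screen_w_m, screen_dist_m = 1.8, 2.0
    angle_deg = 2 * math.degrees(math.atan((screen_w_m / 2) / screen_dist_m))

    px_across_screen = ppd * angle_deg
    print(f"~{ppd:.0f} ppd; the virtual screen spans {angle_deg:.0f} deg "
          f"and gets ~{px_across_screen:.0f} physical pixels across")
    # ~22 ppd, ~48 deg, ~1000 px: a '1080p' virtual monitor lands on fewer
    # physical pixels than 1080p worth, which is why text looks soft.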
Your options there are basically the Apple Vision Pro (UX designed specifically for that use case but it's awkwardly heavy and unbalanced), or the Quest 3 or Pico 4 (more comfortable and much cheaper, both will be much worse at it from a software perspective since their design effort is mostly as standalone video game consoles).
There's also the Xreal glasses, but those are much less 'virtual monitor fixed in space' and much more just having a screen strapped to your face that's tuned to an acceptable focal depth.
Steam is rumored to be releasing a standalone headset soon as effectively a Quest 3 competitor that can also play non-VR Steam games and presumably works with PC/laptop streaming, but there's no real info available yet.
I use a Quest 3 + Immersed software and code 6+ hours a day like this. One giant 50" monitor + 3 support ones on the sides. Best "monitors" ever.
I love that I have no eye strain at the end of the day.
It must be a Quest 3, not the 3S - it must have the pancake lenses, the old ones only focus well in the centre.
Do you have an opinion on how close the Quest 3 + Immersed can get to replicating an 8k 54" tv at 36" away? I know the VR panels are ~30 PPD (and there's other issues) and I'm asking for something that's ~60 PPD. But does blowing up the screen 50% work? (Or changing the aspect ratio, keeping the same pixel count and still scaling.)
MacOS just doesn't really play well with non-retina text anymore. I think windows/linux probably has clearer text at lower PPDs.
I can only describe my setup, have never tried an 8k monitor.
I am guessing I have about 40 PPD. I still have a sliver of space on each side where I see part of my support screens (vertical 1080P on each side). It has a 110 degree horizontal FOV.
My main screen is driven by the MacBook Pro's native resolution (3456x2160). Immersed can create 4k virtual screens, but I don't really see an improvement, the Quest's panels are the limiting factor.
Not crisp like retina, but the size makes up for it. In very rare cases I just lean right up to the screens (they are 3 feet from me). As the focal point is somewhere 4-5 feet away, you can have your face right against the screen to see tiny detail.
Also the fast refresh (I have mine set to 120Hz) matters. I tried my work's Vision Pro. Yes, more crisp graphics. But it gave me motion sickness, there is a slight motion blur, the pixels don't seem to refresh fast enough. The Quest is rock solid, never felt unwell using it.
I see. So it sounds like you have the MBP set to the non-retina setting then passed to quest via immersed? (ie if you look at the laptop screen IRL everything is very small) But in the headset it functions almost as though it was nearly a 50" 4k TV. So presumably has the macOS text issues but worth it otherwise.
And the mouse latency is fine wirelessly?
(Sorry, I don't love the retina word but I haven't found a substitute since high DPI resolves to so many other things.)
Exactly, tiny text on the main screen (if I un-dim it, Immersed dims it automatically). I am not aware of text issues, I guess I never noticed any.
There is a mouse emulation mode that is supposed to be more responsive, and native mouse - I use the native setup and don't notice any lag. Just have a good router and be close to it.
Retina is the only word I know :) because it started way back on iOS/iPhone before it was that common. And probably because I use a Mac.
One other thing that is an unexpected bonus. It rains where I live. A lot. And when it is grey and dim, I can sit in a "sunny office". In fact, I even tweak my day to the virtual environments. Sunrise in a mountain lodge in the mornings, sunny office or ski chalet in the afternoons. Very positively mood altering.
The simple answer is that they don't want to bother supporting that because people don't actually want it. Xreal is king of that product category and designed heavily around 'not looking weird', and still has only a fraction of the lifetime sales of VR headsets.
I tried this with the Rokid Max: it was not usable for me, as it was too blurry. The likely reason was my IPD (I think 71). You may read that this device can adjust IPD, but it's a software solution, not hardware. Since I had no compatible mobile device (no recent enough Android version), I could not even test this.
I would assume not from screen density (even the Apple Vision Pro was rough for that from my attempts and it's got way higher screen density), but it'd definitely be way more comfortable to try it than anything else on the market.
> Does it work with macs/linux?
It's a standard SteamVR/OpenXR headset (though I don't know how the optional eye tracking plays into that), so the limitation should be your GPU more than anything else.
I'm told by co-workers that the way to use VR headsets for coding is to make the text size large, but to create multiple large text panels. With VR, you essentially have low resolution virtual "monitors" but they're very large and you can have 6+ of them open at a time.
Personally, it still makes me sick when I try to use it for that.
"Large" is underselling it. Even with the AVP, about 1.5x real size is the bare minimum for me for text reading comfort, and with other headsets it's more like 2x or larger. It very rapidly gets into territory where you'll give yourself neck strain trying to keep track of everything.
So far it's complete vaporware, and I don't see that changing. They're trying to FOMO people into buying their stock before they even have an actual demo unit working.
This sentiment is around, but I don't think it's accurate. There are working demo models, and they recently gave an update on the delay (they want to improve some small things like buttons and such after getting more devices from the line and trying them out).
All the demos I've seen are from people who say things like "it was really uncomfortable, and didn't have any of the hardware connected to actually make it a self-contained headset, and also rather than any of the real software, it was just running a video demo on the display". Has anything actually come out to the contrary?
Yes, there have been more updates since last fall. Like many, you seem hung up on that one review video that left out a lot of context to create angertainment; people seem to prefer shitting on others to being optimistic these days.
If anyone is just getting a black page, make your window smaller and refresh. For some reason scrolling/rendering with a large viewport causes an undefined to bubble up and break the page.
(I know this is not encouraged by the rules but it's one of the best places to report it :D please don't upvote)
Not sure why they are displaying awful chromatic aberration on their front page under the title "Advanced optics". That's not something to be proud of; if that were the result from a real camera lens, it would be a joke of a lens. Maybe fun for artistic purposes.
I understand this might be a view through a microscope, but there is barely any magnification.
Chromatic and spherical aberrations in VR lenses are actually not critical: you can just split the RGB channels and pre-warp them in shaders with a reasonable performance penalty. That was one of the breakthrough ideas of the Rift DK1.
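For the curious, here's a minimal NumPy sketch of that per-channel pre-warp idea; the distortion coefficients are arbitrary placeholders, and a real compositor does this per eye in a fragment shader with a properly calibrated lens model:

    import numpy as np

    def prewarp_channel(channel, k1):
        """Radial pre-distortion of one color channel: each output pixel samples
        the source at r * (1 + k1 * r^2). Nearest-neighbor for brevity."""
        h, w = channel.shape
        ys, xs = np.mgrid[0:h, 0:w]
        nx = (xs - w / 2) / (w / 2)          # normalized coords around lens center
        ny = (ys - h / 2) / (h / 2)
        scale = 1 + k1 * (nx * nx + ny * ny)
        sx = np.clip(nx * scale * (w / 2) + w / 2, 0, w - 1).astype(int)
        sy = np.clip(ny * scale * (h / 2) + h / 2, 0, h - 1).astype(int)
        return channel[sy, sx]

    def prewarp_rgb(img, k=(0.21, 0.22, 0.23)):
        """Slightly different coefficient per channel, so the lens's
        wavelength-dependent distortion cancels out. Coefficients are made up."""
        return np.stack([prewarp_channel(img[..., c], k[c]) for c in range(3)], axis=-1)

    frame = np.random.rand(256, 256, 3)      # stand-in for a rendered eye buffer
    print(prewarp_rgb(frame).shape)          # (256, 256, 3)

The cost is roughly a few extra texture lookups in the existing distortion pass, which is why it was viable even on DK1-era hardware.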
Wonder if they asked the Iowa State University researchers about their NSF-funded study finding that VR makes women and girls twice as likely to get sick, because Meta sure didn't.
https://nwn.blogs.com/nwn/2024/04/vr-nausea-study-iowa-nsf.h...
I get zero motion sickness and I play exactly that, FPS with stick movement.
Very early on I realized that turning with the stick gives me nausea, but not moving back and forth. So I use the stick to move back and forth, and my own human body to rotate. Can play for hours with zero issues.
It's frustrating, because the two common methods for reducing motion sickness shouldn't be hard to implement, even for lazy console FPS ports: tunneling (a comfort vignette) while the user turns, and teleportation controls. All VR games should have those accessibility options. A stable 90+ FPS framerate and the highest-fidelity VR equipment also help.
Granted, these are still not a silver bullet for motion sickness. A lot more research needs to be done in this field.
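As a sketch of how simple the "tunneling" option is, here's the per-frame logic for a comfort vignette driven by artificial turn speed; the thresholds and fade speed are arbitrary illustration values, not anyone's shipped defaults:

    # Narrow the visible FOV (vignette strength 0..1) while the camera is being
    # rotated by the stick, and relax it smoothly when the turning stops.

    def vignette_strength(prev, yaw_deg_per_s, dt,
                          start_deg_s=30.0, full_deg_s=180.0, fade_per_s=4.0):
        target = min(max((abs(yaw_deg_per_s) - start_deg_s)
                         / (full_deg_s - start_deg_s), 0.0), 1.0)
        step = fade_per_s * dt   # ramp gradually so the mask doesn't pop
        return min(prev + step, target) if target > prev else max(prev - step, target)

    s = 0.0
    for yaw in (0, 0, 200, 200, 200, 60, 0, 0):   # toy frames at 90 Hz: a quick stick turn
        s = vignette_strength(s, yaw, dt=1 / 90)
        print(f"yaw {yaw:>3} deg/s -> vignette {s:.2f}")

Teleport locomotion sidesteps the mismatch between seen and felt motion entirely, which is why it's the other standard comfort option.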
There's research suggesting the difference happens at the hormonal level, i.e. beyond a certain point it's probably not fixable.
It's mind-boggling that the industry just generally isn't interested in looking into this. I asked five top ex-Meta folks about this for my book and they shrugged or didn't answer. You can't say VR is the Next Big Thing if it tends to make half the population literally want to spew chunks.
But what response were you expecting? Plenty of men suffer from VR motion sickness too. It's not like a privileged class is marking the ticket "cannot reproduce" and closing it.
Is it half the population? Do a third of men and two thirds of women suffer motion sickness from VR?
If so, what kind of solutions do you imagine would be in order? The only things I can think of would be improvements to optics, resolution, frame rate, reduction of latency, better motion tracking, maybe reduce headset weight... which seem like the kinds of improvements that this company is working towards?
It's not just better hardware, though. As I mentioned, two very easy solutions to help alleviate motion sickness are purely software based. If you're just going purely hardware, we already know that 90FPS is a bare minimum to keep people from puking, along with controls that have you teleport to a spot instead of jerking forwards with your analog stick.
And having the most high fidelity headset you can get. But its improvements are marginal at best compared to what really needs to be done to figure out motion sickness if you want it to actually catch on.
I don't understand why this is news. The researchers freely admit they don't know the causes either (see speculation below), and frankly Meta's probably in a better position than them to collect additional data. They won't be unbiased, but they're certainly motivated to make their product be useful for as many people as possible.
> As for Danah Boyd’s speculation that the gender difference in VR nausea may have a hormonal component, he says there’s not enough data to answer that question, but there are some intriguing findings:
> “I do not know of any good studies on cybersickness and hormones,” as he puts it. “There has been some research on motion sickness and hormones, and sometimes we extrapolate (cautiously) from the motion sickness literature to cybersickness. For example, Golding, Kadzere, and Gresty (2005) reported that motion sickness is related to hormonal fluctuations during the menstrual cycle. However, they also note that the effect of hormonal fluctuations was much smaller than the gender effect itself, so it is not likely to be the primary explanation.”
> He believes any gender difference might be related to social differences, and less central to the overall challenge of overcoming VR nausea:
> “[S]ome of my research on the gender effect indicates that 1) the effect is relatively small, and 2) the effect is partially explained by differences in prior experience of visually-induced sickness (e.g., screen-based games, movies). It's certainly a topic worth investigating, but it's worth a reminder that there are vast individual differences in cybersickness susceptibility even within a given gender.”
There's also quite a lot of drama right now between Meta, Khronos and OpenXR: https://mbucchia.github.io/OpenXR-Toolkit/
Re #2: I never figured it out, since Virtual Desktop fixed a bunch of my other issues too.
https://www.youtube.com/watch?v=gbFU6KoEASU
https://www.youtube.com/watch?v=CpzZWTz1h0w
https://blog.google/technology/research/project-starline/
All available VR headsets have a fixed focal distance, usually 2 meters, regardless of how close or far the virtual content is from your head.
Eye tracking is done independently per eye.
On my Quest 3, I find 120Hz to be night and day compared to 90.
EDIT: their promo page says that 90Hz OLED feels like 120Hz LCD for VR.
Viture: https://www.cnet.com/tech/gaming/viture-pro-xr-review-great-...
xreal: https://us.shop.xreal.com/products/xreal-one-pro
I really don't even notice it is not retina.
They have better optics than Xreal, and emulating a 1080p screen on a higher-res display allows for less edge blurring and a larger virtual monitor.
So the actual number of pixels used to display the pane containing your text is maybe 10-20% of the headset display's resolution.
I suppose you could display text directly on the headset display. But then you only have 2,560x2,560 pixels per eye, fewer pixels than a 4k display.
Edit: fell for their lies, the headset is only 2k per eye. The 5k figure is "combined", i.e. a marketing number.
I'm with you though: early headsets were low resolution but these modern ones are not. We can still use more pixels though, esp. for text work.
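The 10-20% figure above is easy to sanity-check: the share of panel pixels a virtual pane gets is roughly its share of the horizontal FOV times its share of the vertical FOV. The angles below are assumptions:

    # Fraction of a per-eye panel that a virtual text pane actually occupies.
    pane_h_deg, pane_v_deg = 40.0, 25.0   # assumed angular size of a comfortable pane
    fov_h_deg, fov_v_deg = 100.0, 90.0    # assumed headset field of view

    fraction = (pane_h_deg / fov_h_deg) * (pane_v_deg / fov_v_deg)
    panel_px = 2560 * 2560                # per-eye figure quoted above
    print(f"~{fraction:.0%} of the panel, ~{int(panel_px * fraction):,} px for the pane")
    # ~11%, roughly 730k pixels: comparable to a 1024x720 monitor for your code.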
What is this image supposed to be?
People don’t get sick for real