
Mark Zuckerberg’s vision for the Metaverse was to reimagine the way we interact with each other and the world, providing an immersive world that could seamlessly combine digital and physical information.
Along the way, the parent company, renamed Meta, introduced headsets and began reimagining everyday computing through its Project Orion augmented reality glasses.
But now Meta is making deep budget cuts to its Reality Labs division, which could result in about 10% of the 15,000 employees working on Metaverse and related projects losing their jobs. Meta’s CTO Andrew Bosworth confirmed the staff cuts in a Jan. 13 memo.
But my colleagues' and my years of research suggest this apparent U-turn is far from the end of the technology. As the industry searches for commercial applications beyond gaming, it will likely shift away from fully immersive virtual reality (VR) and toward less immersive ways of blending the digital and physical worlds.
This augmented reality approach has already been realized with products like Microsoft HoloLens, which presents virtual information within an optically transparent display.
These augmented reality devices create the illusion that virtual information appears in a physical 3D space. Additionally, integrated hand and eye tracking technology allows users to interact through gestures and gaze.
The problem with virtual reality
After decades of research and development, VR has become a real product that undoubtedly meets real needs. State-of-the-art headsets give users an impressively immersive 3D experience, with powerful hand and eye tracking built in. Beyond gaming, virtual reality is used to train doctors, engineers, pilots, and more.
However, obstacles arise for more general, everyday applications. I, like many others, believe that with the advent of AI we will need new interfaces beyond our phones to control and benefit from applications at work and at home. At the same time, our research makes clear that many people find VR headsets too immersive, unsettling, and impractical for daily use.
In a two-week user study in 2022, we compared working in virtual reality for an entire week (five consecutive days, eight hours each day) against baseline performance on the same tasks using a standard setup: a regular display, external keyboard, and mouse.
In this study, we asked 16 volunteers to perform routine office tasks such as word processing, programming, and spreadsheet work. The headline result was that users could work in virtual reality for the full week, but doing so came with many challenges.
Compared with the baseline, participants working in VR reported a higher perceived workload, lower usability, lower perceived productivity, more frustration, lower well-being, higher anxiety, more simulator-sickness symptoms, and greater visual fatigue. In other words, VR performed worse on every key metric.
Despite these findings, interviewees said they could see themselves using VR once headsets become lighter and sessions are limited to a few hours at most.
A follow-up research paper in 2024 took a closer look at the video footage collected in that study. It captured what participants did while wearing the headset: adjusting it, managing cables when they got in the way, lifting the headset halfway to eat and drink, answering phone calls, and rubbing their faces.
Our analysis shows that people gradually become accustomed to VR headsets. By the end of the week, participants adjusted their headsets about 40% less often and took them off about 30% less often.
This suggests we can adapt to working in virtual reality much as we work with a physical desktop, keyboard, and mouse. However, if a VR setup is arranged merely to replicate a typical physical setup, VR will inevitably compare unfavorably: we would be demanding that the virtual environment perfectly reproduce the physical work environment, which it cannot.
More importantly, this points to a trade-off. Virtual reality offers a fully immersive environment that transports users to a different virtual world, but that benefit must be weighed against drawbacks such as poor ergonomics, nausea, and fatigue.
Superhuman powers
For any form of extended reality, from augmented reality smart glasses to something far more ambitious, to achieve mainstream success, its positive qualities will have to outweigh its negatives relative to the devices we already know: laptops, tablets, and phones.
The solution, in my view, is to boldly reimagine extended reality as a medium that grants us superhuman powers, rather than as a replacement for or extension of the devices we already use every day. In particular, such a medium would let us seamlessly interact with computing systems in the 3D space around us.
In real life, you must pick up a tool before you can use it: grab a spray can and press the nozzle to paint. On a desktop interface, you click the spray-can icon and then apply paint with the mouse. In extended reality, however, you do not need to select a tool first; you can invoke it with nothing more than a hand gesture.
Simply hold your hand as if gripping a spray can and press down with your index finger: the system recognizes that you want the spray-can tool, and you can then spray paint onto digital objects by pressing and releasing the virtual nozzle with your finger.
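The idea of inferring a tool from the hand's pose, rather than from an explicit menu selection, can be sketched in code. The pose representation and tool names below are hypothetical simplifications, not from any particular headset SDK; a real system would classify continuous joint data from a hand tracker.

```python
from dataclasses import dataclass

# Hypothetical, simplified hand pose: whether each finger is curled,
# as a hand-tracking system might report after thresholding joint angles.
@dataclass(frozen=True)
class HandPose:
    thumb_curled: bool
    index_curled: bool
    middle_curled: bool
    ring_curled: bool
    pinky_curled: bool

def recognize_tool(pose: HandPose):
    """Map a hand pose directly to a tool, so no explicit selection step
    is needed.

    A "spray can" grip: thumb and lower three fingers curled around an
    imaginary can. The index finger then acts as the nozzle: pressed
    (curled) means actively spraying, extended means just holding the can.
    """
    if (pose.thumb_curled and pose.middle_curled
            and pose.ring_curled and pose.pinky_curled):
        return "spray_can_active" if pose.index_curled else "spray_can"
    return None  # no known tool grip detected

# Usage: the grip alone selects the tool; pressing the index finger sprays.
holding = HandPose(thumb_curled=True, index_curled=False,
                   middle_curled=True, ring_curled=True, pinky_curled=True)
spraying = HandPose(thumb_curled=True, index_curled=True,
                    middle_curled=True, ring_curled=True, pinky_curled=True)
print(recognize_tool(holding))   # spray_can
print(recognize_tool(spraying))  # spray_can_active
```

The point of the sketch is the design choice: the mapping from pose to tool replaces the "pick up the can" or "click the icon" step entirely, so the gesture itself is the selection.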
Extended reality could also provide a medium for interacting with personal robots, for example by showing a robot's planned movements in the 3D space in front of us. This will only grow in importance as artificial intelligence becomes more embedded in our physical reality.
Ultimately, any vision for the metaverse (not just Zuckerberg’s version) will succeed only if it goes beyond the current user interface. Extended reality must accommodate the seamless blending of virtual and physical information within a 3D world.
This article is republished from The Conversation under a Creative Commons license. Read the original article here.