Is the idea of the metaverse dead? Even without a much hotter technology in the form of artificial intelligence (AI) capturing the public conversation, most ordinary people have stopped talking about it, beyond reminiscing about the COVID-19-era hype and catty quips at the technology’s expense.
The hype was once so loud that one of the biggest names in the technology industry capitalized on it by changing its own name: we’re looking at you, Meta, the company formerly known as Facebook. Several years on, however, Meta has quietly divested itself of its interests in the area. By December 2025, the venture had lost more than $70 billion since 2021, and the firm was preparing to cut the funding of its metaverse development outfit, Reality Labs, by 30%.
Recently, Meta announced it was shutting down its virtual reality environment Horizon Worlds in June 2026, which would have made the new paradigm we were all promised one of technology’s infamous short-lived flameouts. Then, days later, it reversed course, with company representatives saying the platform would remain available on Quest, Meta’s VR headset.
The original iteration of virtual reality (VR) emerged in the 1990s, primarily for gaming, but the concept goes back further, with immersive digital worlds featured in science fiction as far back as the mid-20th-century work of author Ray Bradbury. A real-world attempt at digital worlds, Second Life, made an early splash when released in 2003, but technical and copyright issues hobbled adoption.
The idea was resurrected by Facebook in the early 2020s. CEO Mark Zuckerberg believed we’d increasingly use shared, immersive experiences that go beyond flat screens as digital social interaction became central to life and work, and studies suggested users were ready, particularly after COVID lockdowns had made remote work the new normal.
Several years later, however, things have changed. While Meta wasn’t the only company touting a new generation of VR, its retreat is a fairly damning indictment of the technology industry’s faith in the concept. But does that mean it’s dead and buried for good? Although it was a non-starter for many, for others it’s not dead but simply still finding its place in the modern world.
“The branding is wounded, or under correction from its initial expectation,” said Lik-Hang Lee, assistant professor of augmented reality (AR) and VR at the Hong Kong Polytechnic University. He authored a study on the metaverse, which was published Oct. 21 in the journal Computers and Society. “The grand vision of a single, shared virtual universe where we all work, play, and socialize in headsets all day was always a bit of a sci-fi oversell.”
Futurist Mark van Rijmenam, who writes and speaks publicly about future technologies, agreed, adding that the vision of cartoon avatars in VR lobbies was never going to happen. But he thinks the metaverse is very much alive nonetheless. “It’s maturing into something more meaningful than the hype once promised,” he told Live Science. “What felt like abandonment was actually a pivot beneath the surface. It’s being rebuilt with purpose, not PR, and with technology that’s actually ready for the spatial internet.”
The debate now isn’t about the metaverse being dead or alive but how the technological support will catch up.
Bob Gourley, CTO at intelligence and analysis firm OODA
There’s no denying that something went seriously wrong with this idea after Meta popularized it during and after the COVID-19 pandemic. The prospect of spending our time in virtual utopias held little appeal, at least in the short term, and what interest there was soon evaporated. When asked why metaverse proponents failed to convince the masses, Lee said there was no “killer app” for non-early adopters.
“VR meetups and virtual offices were marketed as the future, but for most of us, they were just clunkier versions of Zoom, Slack, or games we already had,” he said. “The friction of putting on a headset is high and the reward wasn’t clearly higher.”
For van Rijmenam, the metaverse that received all the hype also missed one crucial unique selling point — it was disconnected from real use cases. “It focused on virtual hangouts and flashy graphics instead of solving real-world problems or merging with everyday workflows. Early metaverse projects promised novelty before utility,” he said.
A common theme that experts brought up was that the hardware wasn’t seamless or mature enough to deliver the promised experience. “The current hardware suffers from limitations like a small field of view, heavy designs, motion sickness and poor graphics,” Bob Gourley, CTO at intelligence and analysis firm OODA, told Live Science. “The future of the metaverse lies in the hands of technologies in AI, 5G, edge computing, and display like microLEDs and better optics that are still to come before it can be fully realized. The debate now isn’t about the metaverse being dead or alive but how the technological support will catch up.”
A particular stumbling block was the VR headsets touted by Oculus (later Meta) and Sony (for use with PlayStation). Not only are they bulky and much harder to set up and use than a laptop or phone, but reports of headaches and nausea were widespread, thanks to something known as the “vergence-accommodation conflict.”
There wasn’t a clear “why” — why do this in VR instead of just using a phone or laptop?
Lik-Hang Lee, assistant professor of AR and VR at Hong Kong Polytechnic University
To focus on an object, the brain uses muscles to rotate both eyes so their lines of sight converge on it, no matter how far away it is, while the lenses of the eyes adjust to bring it into focus. But inside a VR headset, your eyes are always focusing on a small flat screen just inches from your face, an illusion that works, but only up to a point. Prolonged exposure creates a contradiction between where your eyes converge and how your brain directs the muscles in your eyes to focus, a phenomenon that was central to a 2024 study in the Journal of Optometry.
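The mismatch can be put in simple optical terms: vergence is driven by where the virtual object appears to be, while accommodation is pinned to the headset’s fixed focal plane. A minimal Python sketch, assuming a typical 63 mm interpupillary distance and a hypothetical 1.3 m focal plane (values chosen for illustration, not taken from any specific headset):

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.063):
    """Angle the two eyes rotate inward to fuse an object at distance_m,
    assuming a 63 mm interpupillary distance (a typical adult value)."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

def accommodation_conflict_diopters(virtual_distance_m, focal_plane_m=1.3):
    """Mismatch between where the eyes converge (the virtual object) and
    where they must focus (the headset's fixed focal plane), in diopters."""
    return abs(1 / virtual_distance_m - 1 / focal_plane_m)

# At the focal plane the conflict is zero; it grows as the virtual
# object moves nearer or farther than the plane the optics are set to.
for d in (0.5, 1.3, 4.0):
    print(f"object at {d} m: vergence {vergence_angle_deg(d):.2f} deg, "
          f"conflict {accommodation_conflict_diopters(d):.2f} D")
```

The sketch shows why close-up virtual objects are the worst offenders: halving the virtual distance below the focal plane adds far more dioptric conflict than doubling it beyond.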

“Interestingly, humans aren’t purely visual-first organisms,” said Jennalyn Ponraj, founder of Delaire, a research lab focused on voice and human nervous system regulation in AI systems. “Presence is actually regulated through interconnected systems that include vestibular balance, proprioception, breath, and timing. When you flood vision with high-resolution but low-latency input, the rest of the sensory system receives conflicting or absent signals, and it often results in fatigue, nausea, dissociation, and cognitive strain. The technology functions, but the models of human perception are incomplete. Meta’s divestment looks like an admission that immersion ultimately depends on attunement to biophysical regulation.”
Experts suggest this discomfort is close to an insurmountable barrier, with virtual worlds likely going nowhere until accessing them is as easy as putting on a pair of glasses and forgetting they’re there. There’s also one of the most common barriers to the uptake of any new technology paradigm: tech fatigue.
“A lot of people already feel overwhelmed by digital life,” said Lee. “Asking them to strap a gadget to their face for casual interaction is a big ask. There wasn’t a clear ‘why’ — why do this in VR instead of just using a phone or laptop? Outside of gaming and some enterprise use cases, developers struggled to find sustainable business models and reasons for people to come back.”
But not everyone believes in such a manifest destiny for the metaverse. As long ago as 2023, a market research firm told the media: “The metaverse was briefly attractive to enterprises, but few invested seriously in moving the concept further within their organization.”
Even before that, when metaverse hype was still at fever pitch, a Pew Research Center study reported that 46% of respondents predicted the metaverse would not be a refined, truly immersive and functioning aspect of daily life by 2040.
Peak usage was 200 people in the first week — but by the end of the first month, usage had plummeted to single digits. It simply took too long for customers to navigate the virtual space.
Saswata Baksi, co-founder of Local Glyph
Mircea Dima, a former software engineer at Facebook, Unity and Adobe who is now founder and CEO of AlgoCademy, thinks there’s little future in mass-market metaverse use. The limiting factor, he told Live Science, is human behavior, not graphics or computing power.
“Wearing a headset isolates you from other people, puts a physical strain on your body, and demands prolonged focus,” he said. “No one will wear a device on their face for an extended period of time for tasks that can be done just as quickly with a laptop or smartphone. Hardware getting to a point where it’s comfortable to wear on your head isn’t going to remove that friction.”
He added that the metaverse still has proponents because the concept is “emotionally appealing.”
“Engineers love working within a closed system where almost anything is programmable. Investors hate to admit they’ve thrown good money after bad and founders are often emotionally attached to a publicly promoted vision of what they wanted to create, and won’t easily abandon it.”
Games, virtual collaboration tools, industrial simulations and augmented reality on phones and headsets are all growing.
Lik-Hang Lee, assistant professor of AR and VR at Hong Kong Polytechnic University
In one example, Mark Friend, company director at education IT provider Classroom365, delivered and ran VR classroom pilots. While such programs captured students’ attention, he said, the metaverse will never be mainstream in the field. He told Live Science the pilot data showed increases in student focus of up to 150%, but the model collapsed in the real world. The primary barriers to scale, he added, were hardware cost and the time needed to meet compliance requirements.
Saswata Baksi, co-founder of Local Glyph, told a similar story, saying the metaverse is a non-starter because it solves a problem users don’t have. “A retail brand spent six months creating a virtual showroom for their customers to browse products as avatars,” he told Live Science. “Peak usage was 200 people in the first week — but by the end of the first month, usage had plummeted to single digits. It simply took too long for customers to navigate the virtual space. They preferred scrolling through Instagram on their phones, which offered no setup friction.”
As for why so many organizations continue to assert the metaverse will have its day, almost everybody Live Science spoke with highlighted the “sunk cost fallacy” phenomenon.
Many think the general philosophy of meeting, working and playing in virtual worlds in digital form is still coming. Lee said that if you strip away the hype, more and more of our lives are moving into persistent digital spaces: “Games, virtual collaboration tools, industrial simulations and augmented reality on phones and headsets are all growing.”
But perhaps the ultimate irony is that AI, which left the metaverse in the dust on the hype scales, will be one of the cornerstones that enable it. “The original metaverse failed because content creation and responsiveness were manual and static. AI changes that,” said van Rijmenam.
Joachim van der Meulen is the secretary and facilitator of DROPS Asia, an industry association working to prevent harm from dropped objects in the workplace. He began building VR training content in 2017 and quickly learned “how difficult and expensive it is to deliver a genuinely good immersive experience,” but he now sees a way to rekindle the effort using AI.

“Much of our effort went into maintaining core systems and keeping the software up to date, leaving little budget or capacity for new features or content,” he said. “AI-assisted development proved effective for routine, labour-intensive tasks, in some cases solving long-standing technical problems, and we’re now seeing experimentation with generative AI for 3D environments.”
Lee agreed, adding: “If anything like the metaverse is ever going to work at scale, AI will probably be a huge part of what makes it viable when it comes to content creation, intelligent characters and agents, personalization and usability and real-time understanding of the ‘world model’ and translation.”
Then there’s the roll call of technologies, from the Apple Newton to Google Glass, that failed only because they were too early, a position the metaverse finds itself squarely in today; almost everybody we spoke to for this article agreed it still shows potential despite a subpar early hardware experience. Echoes of Google Glass can also be seen in the Ray-Ban Meta devices and similar AR-powered smart glasses of today, which face regulatory and privacy hurdles of their own.
“The metaverse feels like it’s in its middle phase,” said Lee. “The initial ‘this will change everything tomorrow’ phase is over. What we have now is slower, more incremental progress. It’s a classic technology trough — less visible, less glamorous, but often where the most important engineering and design work actually happens.”
Even though Meta has cut funding and effectively neutered the idea, the underlying technologies and architectural foundations are only now emerging, van Rijmenam said. AI, spatial computing, AR and VR hardware, and real-world physics models driven by spatial intelligence and real-time rendering are all converging to provide a foundation for a future attempt to resurrect the metaverse. “What once felt abandoned was actually incubation,” he said.