After $177 billion of investment, why do Metaverse graphics still suck?

Almost 20 years have passed since the release of Second Life, Linden Lab’s first attempt at an immersive multiplayer universe, where people began to live and work, earning big bucks along the way. Two decades later, the promise first hinted at in Second Life is getting closer to reality, as the persistent digital world of the metaverse begins to make inroads into the mainstream.

The breathless coverage and endless hype surrounding the Metaverse would convince the average person that they’ll soon have to start planning a life permanently strapped to a VR headset.

A billion of us are expected to enter the Metaverse by the end of the decade, if Mark Zuckerberg has his way, while investment bank Citi says the Metaverse could support an economy worth between $8 trillion and $13 trillion by the same date. It’s jaw-dropping numbers like these that have attracted more than $177 billion in investment into the metaverse since the start of 2021, according to McKinsey.

There’s only one problem: the graphics on the platforms advertised as being at the forefront of this future look about the same as, if not worse than, 20-year-old Second Life.

When Meta announced the launch of its Horizon Worlds metaverse platform in France and Spain this week, it was met with widespread mockery. The brunt of the criticism was borne by CEO Mark Zuckerberg’s “dead-eyed” legless cartoon avatar, forcing a hasty redesign.

Meta hastily rolled out an updated Mark Zuckerberg metaverse avatar. Image: Meta

It’s not just the big tech incumbents that are under fire. Web3 metaverse platforms such as Decentraland have also been criticized for their graphics.

Decentraland’s “relentlessly flat” terrain. Image: Decentraland

Decrypt’s own review of Decentraland took aim at its “relentlessly flat” plots of land and graphical pop-in. “Even on the highest settings,” our reviewer said, “it’s too graphically limited to be a particularly engrossing VR experience.” The likes of Cryptovoxels and The Sandbox are all rendered in blocky, cartoonish visuals reminiscent of vintage games from the 2000s.

The great void

All of which raises the question: why are the graphics in the Metaverse so terrible?

There are many possible reasons, with different platforms offering different explanations depending on the level of graphical fidelity they deliver.

One of the main problems currently facing metaverses is that real-time graphics rendering requires a lot of processing power and super-fast internet speeds, which aren’t always available to users. Graphics cards and broadband connection speeds limit the ability of metaverse platforms to present highly detailed graphics, which means they often rely on broader-brush visuals.

The Sandbox. Image: Decrypt

Metaverses often have worse graphics than MMO games because they are, by design, far more open-ended. Rather than having users simply follow a pre-programmed set of options, as games do, the Metaverse theoretically allows for an infinite number of possibilities that can’t be pre-rendered and called up when needed.

There’s also the suggestion that having a totally cartoonish metaverse is better than the alternative: a mostly realistic environment with a few fatal flaws.

The concept of the uncanny valley, where graphics are almost lifelike but contain flaws that unsettle users, is already well established in video games. And in an environment where you’re rendering things in real time and giving users the ability to make almost unlimited decisions, there are simply too many variables that could go wrong and push people into the uncanny valley.

A problem with the legs

The issue is particularly tricky when it comes to legs.

For Metaverses built around VR interfaces, legs are “super tough and fundamentally not achievable just from a physical perspective with existing headsets,” Andrew Bosworth, then vice president of Reality Labs at Meta and now its chief technology officer, told CNN Business in February.

“It’s a hardware issue,” says Gijs den Butter of SenseGlove, a Dutch company developing the haptic-feedback gloves and devices that will be an important part of the metaverse, if we ever come to inhabit it. “Right now the hardware is a headset, which has controllers or hand tracking, and that’s our computer for the metaverse,” he says. “As it stands it doesn’t have legs, because the hardware can see your hands and maybe your arms, and track those, but when you look forward you can’t see your legs.”

It’s tricky because the body-tracking algorithms that help identify where you’re pointing in the metaverse require input from body parts they can see, and as anyone who stands up straight and looks directly ahead knows, you can’t see your own legs. As a result, computers trying to render the digital equivalent of your body in the metaverse leave it legless.

That’s less of an issue for crypto-based metaverses like Decentraland and The Sandbox, which rely primarily on browser- or desktop-based interfaces rather than fully immersive VR, for now.

“It’s really Facebook/Meta and Microsoft, these immersive platforms,” whose avatars lack legs, says Weronika Marciniak, a Hong Kong-based metaverse architect at Future Is Meta. “Most worlds, like VRChat, Decentraland, The Sandbox and others, feature avatars with legs, even though you don’t necessarily have sensors on your legs.” These rigs get around the problem by “faking it,” before Marciniak corrects herself: by “assuming the position of the user’s legs.”

Den Butter says the lack of legs in major mainstream metaverse platforms is not due to a lack of processing power. “Legs, like all moving parts, are basically built from a kinematic model,” he says. “The mathematical models of the hands are quite heavy, but for the legs it’s just a few points that need to be covered.”
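To get a feel for how light that kind of kinematic model can be, here is a minimal sketch of a standard two-bone leg solve of the sort Den Butter is alluding to: given just a hip position and a target foot position, basic trigonometry yields plausible joint angles. The 2D side-view simplification, function name, and bone lengths are illustrative assumptions, not taken from any particular engine.

```python
import math

def solve_leg_ik(hip, foot, thigh_len=0.45, shin_len=0.45):
    """Two-bone inverse kinematics for one leg, seen from the side (2D).

    Given only a hip position and a target foot position, return hip and
    knee joint angles (in radians) that place the foot there: a lightweight
    kinematic model driven by just a couple of points.
    """
    dx, dy = foot[0] - hip[0], foot[1] - hip[1]
    # Clamp the hip-to-foot distance so the target stays within reach.
    dist = min(math.hypot(dx, dy), thigh_len + shin_len - 1e-6)

    # Law of cosines: interior angle at the knee of the hip-knee-foot triangle.
    cos_knee = (thigh_len**2 + shin_len**2 - dist**2) / (2 * thigh_len * shin_len)
    knee_angle = math.acos(max(-1.0, min(1.0, cos_knee)))

    # Hip angle: direction toward the foot, offset by the triangle's hip corner.
    cos_hip = (thigh_len**2 + dist**2 - shin_len**2) / (2 * thigh_len * dist)
    hip_angle = math.atan2(dy, dx) + math.acos(max(-1.0, min(1.0, cos_hip)))

    return hip_angle, knee_angle

# Hip at the origin, foot planted slightly forward of and below it.
print(solve_leg_ik(hip=(0.0, 0.0), foot=(0.2, -0.85)))
```

Compared with tracking an articulated hand, this is trivial arithmetic per frame, which is why the bottleneck is sensing the legs rather than computing them.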

He says existing low-end hardware like the Azure Kinect or a Wii camera could capture the relevant data points, which means transmitting and processing that data to render it in the metaverse, whether locally or via edge computing, probably wouldn’t cause too much lag.

Instead, he and Marciniak attribute the lack of legs to hardware limitations, and specifically to the limited field of view of existing head-worn devices.

That should change soon, however. In December 2021, sneaker company Nike bought RTFKT, a move that Marciniak says could be the first step toward headset-like controllers for our feet. “They may be working on actual shoes or socks with sensors that would be connected to VR headsets,” she speculates.

Take it to the other side

One metaverse unlike any other is Otherside, from Bored Ape Yacht Club creators Yuga Labs. Built around Improbable’s M2 engine, Otherside looks like it belongs in 2022, which is no small feat, according to those who designed it.

“We don’t just throw a platform at our partners,” Rob Whitehead, co-founder and COO of Improbable, told Decrypt; instead, the company works with partners on what they want from the metaverse and designs around it. “There are some amazing projects out there, but it looks like you took an app and tried to make a metaverse out of it,” he says. “It looks classy, but we’re just more about taking game-like experiences and making them more playful and metaverse-like.”

Improbable has sunk hours of research and development into its M2 engine to enable it to render tens of thousands of unique characters, using machine learning techniques that push processing onto users’ GPUs rather than sending the data via the cloud. “The problem is that if you double the number of people in a dense space, you quadruple the amount of data you need to send,” says Whitehead.
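Whitehead’s point is just the arithmetic of pairwise updates: if every player in a dense space has to be kept informed about every other player, the message count grows with the square of the player count. A rough back-of-the-envelope sketch (the function and the player counts are purely illustrative):

```python
def pairwise_updates(players: int) -> int:
    # If every player must receive state about every other player each tick,
    # the server sends roughly n * (n - 1) messages per tick.
    return players * (players - 1)

for n in (100, 200, 400):
    print(f"{n} players -> {pairwise_updates(n):,} updates per tick")

# 100 players ->   9,900 updates per tick
# 200 players ->  39,800 updates per tick  (roughly 4x)
# 400 players -> 159,600 updates per tick  (roughly 4x again)
```

Doubling the crowd roughly quadruples the traffic, which is why dense shared spaces are so much harder to serve than instanced game lobbies.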

Whether other Metaverses will rethink their approach to visuals is another matter entirely. But it’s likely to become an increasingly pressing question if the Metaverse is to achieve the mainstream adoption its proponents desire.
