What the Metaverse Means for Developers – InApps Technology 2022
Last month Facebook CEO Mark Zuckerberg outlined his vision for the metaverse, a 3D version of the web, in an interview with The Verge. We’ve had a flurry of metaverse news from other tech companies this year — everything from Microsoft’s “enterprise metaverse” to Nvidia’s “metaverse for engineers.” So the metaverse is the latest catchphrase in the tech industry. But what will it mean to be a developer in the metaverse? Will it be an open platform, like the web? Or will it be controlled by one or two companies, like the mobile app ecosystem?
Richard is senior editor at InApps Technology and writes a weekly column about web and application development trends. Previously he founded ReadWriteWeb in 2003 and built it into one of the world’s most influential technology news and analysis sites.
In practical terms, developing a metaverse application in today’s environment means developing a game or “experience” for any number of different platforms — everything from VR systems like the Oculus Quest and HTC Vive, to emerging virtual worlds like Roblox and Fortnite, to gaming consoles like PlayStation and Xbox, to mobile apps and even the web itself. There are just as many developer tools to choose from, such as Epic’s Unreal Engine, Unity, Amazon Sumerian, Autodesk’s Maya, and the open source Blender.
So metaverse development really is a greenfield right now. Unlike with the mobile application ecosystem, there aren’t one or two dominant platforms — yet. Facebook would no doubt like to be one of those dominant metaverse platforms, eventually, although Zuckerberg claimed that “we’re going to be contributing to trying to build a more open and interoperable” system.
While we wait for the metaverse platform ecosystem to emerge over the coming years, I thought I’d take a look at three of the current initiatives that have had the label “metaverse” stuck onto them: Microsoft’s “metaverse stack” (announced at this year’s Build conference), Nvidia’s Omniverse, and the “Metaverse product group” announced by Facebook vice president of AR and VR, Andrew Bosworth, a few days after Zuckerberg’s interview.
In particular, I’ll look at how developers can get involved in those platforms and potentially help define what the future of 3D applications will look like.
Microsoft’s Metaverse Stack
During Microsoft CEO Satya Nadella’s keynote address at the Build conference in June, he talked about “a new layer of the infrastructure stack that’s getting created as the digital and physical worlds converge: the enterprise metaverse.” He referenced a “metaverse stack” that enables developers to “build a rich digital model of anything physical or logical.”
What’s interesting about Microsoft’s conception of the metaverse is that it is very much modeled on the real world. There are no fantasy avatars in Microsoft’s metaverse — leave those to Roblox or Fortnite (or Second Life!). So-called “metaverse apps” in Microsoft’s universe will be underpinned by “digital twins,” defined in a separate post on the Azure blog as “rich digital models of anything physical or logical, from simple assets or products to complex environments.”
The ‘digital twin’ concept comes out of the Internet of Things (IoT) world, which gives a broad clue as to Microsoft’s intentions here. It wants to provide a platform to digitally map and monitor everything in a real-world business environment — warehouses, factories, retail stores, and so on. It’ll be like a 3D version of Microsoft Office.
From a developer perspective, as usual Microsoft covers pretty much everything — you can apply complex machine learning technology to digital twins, or build a simple application on top of digital twin data using Microsoft’s Power Platform (its low-code toolset).
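To make the “digital twin” idea more concrete, here is a minimal sketch of a twin type definition in DTDL (Digital Twins Definition Language), the JSON-based language Azure Digital Twins uses to model physical assets. The model ID and property names below are hypothetical examples invented for illustration, not part of any Microsoft product.

```python
import json

# A minimal DTDL v2 interface describing a hypothetical warehouse sensor.
# DTDL is the JSON-LD-based language that Azure Digital Twins uses to
# define twin types; the dtmi ID and property names here are invented
# for illustration.
warehouse_sensor = {
    "@id": "dtmi:example:WarehouseSensor;1",
    "@type": "Interface",
    "@context": "dtmi:dtdl:context;2",
    "displayName": "Warehouse Sensor",
    "contents": [
        # A Property holds last-known state; Telemetry is a stream of events.
        {"@type": "Property", "name": "temperature", "schema": "double"},
        {"@type": "Telemetry", "name": "doorOpened", "schema": "boolean"},
    ],
}

# This JSON document is what you would upload to the Azure Digital Twins
# model store before creating twin instances of this type.
print(json.dumps(warehouse_sensor, indent=2))
```

An application built on top — whether in the Power Platform or custom code — would then read and write live twin data conforming to a model like this, rather than talking to devices directly.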
Nvidia and the HTML of 3D
At this week’s online Siggraph event, an annual conference on computer graphics, Nvidia announced an expansion of its Omniverse platform. Omniverse launched back in March 2019 as “an open collaboration platform to simplify studio workflows for real-time graphics.” Basically, it allows engineers to collaborate on building a physical product, by working together on a digital representation of that product. So it has the same “digital twin” philosophy as Microsoft.
Omniverse is based on open source technology developed by Pixar, called Universal Scene Description (USD). In a presentation for Siggraph, Richard Kerris, vice president of Omniverse at Nvidia, described USD as “the HTML of 3D.” He added that many other companies, including Apple, support USD. “Like the journey from HTML 1.0 to HTML 5,” he continued, “USD will continue to evolve from its nascent state today to a more complete definition for the virtual world.”
During the presentation, Kerris positioned Omniverse as “connecting the open metaverse” — which suggests that Nvidia sees Omniverse as the 3D equivalent of a web browser.
“Users can portal in and out of Omniverse with workstations or laptops,” he continued, “allowing them to teleport into the environment with VR. Or they can mix with AR and anyone can view the scene on Omniverse by streaming RTX to their device.” (RTX is a high-end professional visual computing platform by Nvidia.)
Developers can learn about building “Omniverse extensions and microservices” at the Omniverse Developer Resource Center. There is a developer kit available, along with tips to get started building 3D scenes using USD.
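As a taste of what USD looks like in practice, the sketch below assembles a tiny scene in USD’s human-readable .usda text format. Building the text by hand keeps the example dependency-free; in a real pipeline you would author it with Pixar’s pxr.Usd Python API instead. The prim names are illustrative.

```python
# A minimal scene in USD's human-readable .usda format, built as a plain
# string so the example runs without Pixar's USD libraries installed.
# In a real pipeline you would author this with the pxr.Usd API instead.
scene = """#usda 1.0
(
    defaultPrim = "World"
)

def Xform "World"
{
    def Sphere "Ball"
    {
        double radius = 2.0
    }
}
"""

# Saved to disk, this file can be opened by any USD-aware tool,
# including Omniverse applications.
with open("hello_world.usda", "w") as f:
    f.write(scene)

print(scene.splitlines()[0])  # → "#usda 1.0"
```

The scene is a hierarchy of “prims” (here an Xform containing a Sphere), which is the structure that lets multiple tools and collaborators layer edits onto the same scene — the property Kerris’s “HTML of 3D” comparison is pointing at.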
Facebook’s Looming Presence
Unlike Microsoft and Nvidia, Facebook’s metaverse is currently vaporware. To be fair, that’s because Facebook only just announced its metaverse product group.
In his Facebook post announcing the group, Andrew Bosworth noted that two current Facebook products — Portal (a video calling device) and Oculus — can “teleport you into a room with another person, regardless of physical distance, or to new virtual worlds and experiences.” He described this as the type of “presence” that metaverses will require, but added that Facebook still needs to build “the connective tissue between these spaces.”
It’s too early to say whether Facebook will be able to provide that “connective tissue” (similar language to the term “social graph,” which Facebook popularized from about 2007 on). But since so many of us today use Facebook as our primary social network — at least for real-world family and friend relationships — it makes sense that Facebook would want to extend that social graph into the virtual world.
Don’t Forget the Open Metaverses
Developers can test out both Microsoft and Nvidia’s metaverse stacks today, or jump into more playful visions of the metaverse like that of Roblox (which has an active developer hub). There are also open source platforms out there, if you would prefer your metaverse to be non-commercial. Mozilla Hubs is browser-based, so you’ll be using web technologies like Three.js and WebAssembly (WASM). The Open Metaverse OS is another open platform, this time tapping into the crypto trend (it uses NFTs, decentralized governance, and the like).
Regardless of which vision of the metaverse appeals to you, I’m certain we are headed for a 3D version of the web in the coming years. As with the rise of the 2D web in the 1990s and into the 2000s, this will present many opportunities for developers. So it’s time to teleport in and check out these emerging platforms for yourself.
Lead image by Sound On from Pexels.