What once began as a revolution of communication in the form of text, images, and two-dimensional formatting is now evolving into new dimensions. The metaverse has been with us in one form or another for some time, but the progression of critical technologies has vastly pushed the envelope of how we define and use the web. As a business owner looking to remain competitive in the future, you need to stay informed on the technologies involved in this space if you are to develop applications and even entire digital worlds for your customers.
Let’s talk about the technologies that are used in metaverse development and how businesses can create their own metaverse applications.
Neal Stephenson’s 1992 novel Snow Crash coined the term that became the name for the metaverse, and executives like Mark Zuckerberg are now, somewhat ironically, working to fulfill the book’s predictions about the future of business, the economy, and society. The metaverse, as it is understood in modern business, is an immersive next step of the Internet in which users participate in shared virtual worlds.
It’s clear that the future of the Internet will arrive sooner rather than later. According to Bloomberg, metaverse technologies will be an $800 billion market in 2022. If businesses want to remain relevant over the course of that evolution, they need to stay on top of the technologies driving ‘web 3’ forward.
There are a number of different technologies that are powering metaverse development. Understanding each of them and recognizing where they intersect is critical to innovation.
Virtual reality is one of the most prolific technologies moving the metaverse forward, but it has several limitations, and the world isn’t quite ready for widespread, easily accessible VR. The first issue is limited mobility: experiencing virtual reality requires a special VR headset, and current headsets are bulky enough to restrict the head and body movements needed for a full experience. Moreover, VR devices are expensive, and there are no standards for virtual reality yet, so content created for one platform may not work on another.
Augmented reality is a more accessible technology that is breaking the boundaries between the real and digital worlds. It’s available on nearly all modern smartphones, making it a viable option to support a freer and more open metaverse. Even though AR doesn’t provide a fully immersive experience, overlaying images and information over the real world may be a more effective strategy for companies this early in the game.
— Andrew Makarov, Head of Mobile Development
Realizing this, in 2022, Meta introduced additions to its Spark AR platform such as voice effects, improved depth mapping, and effects blending to help developers create metaverse applications.
Spark AR’s Audio Visualizer feature for audio integration in your effects
Source: Spark AR
Features like this, which allow AR effects to respond to sound, can help you create an even more engaging environment.
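The Spark AR scripting details are product-specific, but the underlying idea — driving a visual parameter from audio loudness — can be sketched in a few lines of Python (the function names and scale mapping here are illustrative, not Spark AR APIs):

```python
import math

def rms_amplitude(samples):
    """Root-mean-square loudness of one audio frame (values in [-1.0, 1.0])."""
    if not samples:
        return 0.0
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def effect_scale(samples, min_scale=1.0, max_scale=2.0):
    """Map the frame's loudness to a visual effect scale factor."""
    level = min(rms_amplitude(samples), 1.0)  # clamp to the nominal range
    return min_scale + (max_scale - min_scale) * level

# A silent frame leaves the effect at its base size; a loud frame grows it.
quiet = [0.0] * 512
loud = [0.9, -0.9] * 256
```

In a real effect, the scale factor would be applied to a mesh or particle emitter each audio frame, so the visuals pulse with the sound.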
Another way that augmented reality can be used for more immersive experiences overlaid on top of the real world is the ‘try before you buy’ retail strategy. Through the use of virtual fitting room technology or similar technologies, shoppers can try on items virtually with AR before they decide to make a purchase. IKEA Place is a great example of how this functionality extends to furniture and interior design, allowing shoppers to place virtual furniture in their house to see how it may look before they buy it.
Extended reality technologies like AR and VR have many limitations, but developers can overcome many of them with another advanced technology: artificial intelligence.
Creating lifelike 3D spaces, delivering realistic experiences, and running the complex calculations behind AR face tracking and similar tasks are best handled by AI, which can complete them far faster than humans can. Self-supervised learning promises to further increase the efficiency of AI-powered systems.
A subset of artificial intelligence that is important for the development of the metaverse is natural language processing. This is an advanced way for AI to interpret and emulate human speech. Not only will this be a great way for users and AI to interact in the form of customer service chatbots and virtual assistants, but it can also make the metaverse more accessible for diverse groups of people. For example, conversational artificial intelligence enables rich real-time language translation, even though it’s a challenging task.
In fact, there is a non-monotonic relationship between the source speech and the target translation: words at the end of the speech can influence words at the beginning of the translation. This means there is no truly real-time speech translation, because the translated text must constantly be checked for consistency against the original speech (so-called re-translations). There is always a small delay, even if you can’t see it. You therefore need advanced algorithms to stabilize the translation of live speech, as Google does in Google Translate to reduce the number of re-translations.
Internally, real-time speech translation may be organized as follows: the user speaks, the speech is converted into text, and that text is translated into the target language. Once the speech pauses or ends and the final re-translation is done, the translated text is converted into audio using text-to-speech technology.
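The re-translation loop described above can be sketched minimally in Python — the `translate` stub below is a toy word-for-word stand-in for a real machine translation model, used only to make the data flow visible:

```python
def translate(text):
    """Stub for a translation model; a real system would call an MT service."""
    mapping = {"hello": "hola", "world": "mundo", "friends": "amigos"}
    return " ".join(mapping.get(w, w) for w in text.split())

def live_translate(speech_chunks):
    """Re-translate the full transcript each time new recognized words arrive.

    Production systems add stabilization so that words already shown to the
    user flicker as little as possible between re-translations.
    """
    transcript = []
    shown = []  # history of what the user would have seen on screen
    for chunk in speech_chunks:
        transcript.append(chunk)                       # speech-to-text output so far
        shown.append(translate(" ".join(transcript)))  # re-translation of everything
    return shown  # the last entry is the final translation

updates = live_translate(["hello", "world"])
```

Note that every new chunk forces a re-translation of the whole transcript — which is exactly why stabilization algorithms matter for live captions.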
NLP can also provide live captions for users with hearing impairments. For example, AI can instantly transcribe a group conversation, making communication within a metaverse application accessible to users with hearing disabilities.
NLP also makes digital voice assistants and AI avatars possible. These can help users operate their devices hands-free and can offer targeted suggestions. Meta is already developing a voice assistant to be used in metaverse applications in the coming years.
Virtual assistants can perform language translation, financial management, and much more.
The representation of users in the metaverse as AI avatars or digital humans also relies on NLP. Conversational AI allows avatars to process and understand human language and respond to voice commands. Last year, NVIDIA introduced its Omniverse Avatar platform, which lets you create virtual versions of people that not only recognize speech but also read emotions from users’ faces.
One important role that virtual assistants can help with in the metaverse is customer service. Since shopping experiences in the metaverse will be highly immersive, conversational AI will be very useful for giving shoppers the opportunity to ask virtual customer service avatars about the characteristics of the goods, payment terms, discounts, and the like.
Computer vision can enable machines to better create digital copies of objects, recognize images and patterns, and even recognize the expressions and moods of users.
One of the limitations of VR and AR experiences is control. Hardware controllers, gloves, and other physical devices can provide input, but computer vision can make the experience more natural through hand tracking. By recognizing gestures and finger positions, the system lets users interact with their devices more naturally and freely.
https://www.youtube.com/watch?v=9sUsWwz5B7U
AR hand-tracking demo by MobiDev
This is one scenario of how it can work. The AR implementation coordinates the phone’s video camera and its LiDAR sensor: the camera captures the image of the real world and the user’s hand, while LiDAR estimates the distance to real-world objects and to the hand. With that information, we can place virtual objects on the phone screen so that, from the user’s perspective, they look like part of the real world.
With the help of computer vision, we can recognize whether the user is trying to interact with a virtual object with their hand. Examples of such interactions include adding a virtual object to a cart in a virtual shop or animating objects (useful for interactive AR games).
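Once the fingertip’s 3D position is known from the camera image plus LiDAR depth, the interaction check itself can be as simple as a proximity test in world space — a simplified sketch (real pipelines also filter tracking noise and debounce over several frames):

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_touching(fingertip, object_center, radius=0.05):
    """True if the tracked fingertip is within `radius` meters of the object.

    `fingertip` would come from hand tracking combined with LiDAR depth;
    both points are assumed to be in the same world-space coordinates.
    """
    return distance(fingertip, object_center) <= radius

# A fingertip ~2 cm from a virtual button counts as a press; 20 cm does not.
```

A "touch" event like this is what would then trigger adding the item to a cart or starting an animation.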
Computer vision in the metaverse doesn’t just stop there. ReadyPlayerMe uses face recognition to create a virtual avatar from a user’s selfie. Most video games and platforms require users to create a brand new avatar to use for each service. However, these avatars created by computer vision are designed to be used across thousands of different platforms.
Given that users interact with the metaverse in the form of digital avatars with bodies, it’s important that the posture of those characters be accounted for. Human pose estimation (HPE) uses motion sensing devices like controllers, gloves, and more to accomplish this. HPE recognizes body parts and their positions in an environment, while another practice called action recognition can identify more complex interactive activities like grabbing items or pushing buttons.
Human pose estimation technology in action
Source: MANUS
Knowing the position of one’s limbs is only the beginning. Hand gestures, gait, eye movement, and even facial expressions can also be recognized to improve the system. Using human pose estimation technology, users can map their motion onto a chosen avatar and dive into the digital world.
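Downstream of a pose model, even simple geometry on the detected keypoints is useful — for example, computing a joint angle to drive an avatar’s elbow. A sketch (the 2D keypoints here are hypothetical model outputs):

```python
import math

def joint_angle(a, b, c):
    """Angle at joint `b` (in degrees) formed by keypoints a-b-c,
    e.g. shoulder-elbow-wrist, as (x, y) positions from a pose model."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    cos_angle = max(-1.0, min(1.0, dot / (n1 * n2)))  # guard rounding error
    return math.degrees(math.acos(cos_angle))

# A straight arm (three collinear keypoints) gives ~180 degrees;
# a right-angle bend gives ~90 degrees.
```

Feeding angles like this into an avatar’s skeleton each frame is one way motion gets synchronized between the user and their digital double.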
Artificial intelligence is just a part of the metaverse story and it’s usually not the answer to every problem that developers face when making metaverse projects. AI needs high quality data, and that data needs to come from somewhere. Internet of Things devices and sensors are critical for providing high-quality real-time data to AI systems for analysis.
One of the most useful applications of IoT in the metaverse is digital twins. This technique utilizes IoT sensors to create a digital version of an environment or system. With VR relying heavily on virtual environments, being able to create a virtual representation of an environment using sensors is in high demand.
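The sensor-to-twin data flow can be sketched minimally as follows — the sensor name and the rendering rule are illustrative assumptions, not part of any specific IoT platform:

```python
from dataclasses import dataclass, field

@dataclass
class RoomTwin:
    """A minimal digital twin: the virtual object mirrors the latest
    readings pushed by real-world IoT sensors."""
    state: dict = field(default_factory=dict)

    def on_sensor_update(self, sensor_id, value):
        """Called whenever a physical sensor reports a new reading."""
        self.state[sensor_id] = value  # keep the twin in sync

    def render_hint(self):
        """What the virtual scene should show, derived from sensor state."""
        temp = self.state.get("temperature_c")
        return "show heat shimmer" if temp is not None and temp > 30 else "normal"

twin = RoomTwin()
twin.on_sensor_update("temperature_c", 35)
```

Real digital-twin systems add message brokers, historical storage, and simulation on top, but the core loop is the same: sensor readings update state, and the virtual scene renders from that state.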
The metaverse is not simply a new digital world; it is the intersection and seamless crossover between the real and digital worlds. Using augmented reality technologies with IoT sensors to bring the real world into the digital, and the digital into the real, will revolutionize metaverse technologies.
Blockchain platforms, as global and decentralized systems, are in demand for metaverse projects. Centralized data storage is problematic in the metaverse because of the barriers it puts on the flow of information. A more open solution like a blockchain allows a more fluid flow of information and provides proof of ownership for digital assets. As a result, there is high demand for systems that can support cryptocurrencies and non-fungible tokens.
Today, non-fungible tokens (NFTs) are the most promising way to develop the metaverse economy. Since each token is unique, it can be a reliable proof of digital ownership recorded in the blockchain. For example, users can buy in-game assets and digital real estate in the form of non-fungible tokens representing the right to own these items.
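The ownership logic behind a token can be illustrated with a drastically simplified in-memory ledger — real NFTs live on a blockchain with consensus and cryptographic signatures, none of which is modeled here:

```python
class ToyTokenLedger:
    """Each unique token id maps to exactly one owner, and only the
    current owner can transfer it -- the core of provable ownership."""

    def __init__(self):
        self._owners = {}

    def mint(self, token_id, owner):
        if token_id in self._owners:
            raise ValueError("token already exists; each token is unique")
        self._owners[token_id] = owner

    def transfer(self, token_id, sender, recipient):
        if self._owners.get(token_id) != sender:
            raise PermissionError("only the current owner can transfer")
        self._owners[token_id] = recipient

    def owner_of(self, token_id):
        return self._owners[token_id]

ledger = ToyTokenLedger()
ledger.mint("plot-42", "alice")            # digital real estate as a token
ledger.transfer("plot-42", "alice", "bob")
```

On a real chain, the same rules are enforced by the network rather than a single trusted process, which is what makes the ownership record tamper-resistant.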
With the metaverse relying heavily on virtual worlds, 3D modeling is a skill that’s in high demand. From decorating homes to creating skins for avatars, modeling is something that virtual worlds can’t do without. With such a large number of objects that need to be digitized, it’s clear why IoT sensors need to be used to create digital twins of environments. Large databases need to be made of real-world objects that have been ‘3D captured’ and digitized.
However, there are challenges to the digitization of the real world. The higher resolution that an object is digitized with, the greater the memory it will use. Finding space for all of these objects and rendering them on lower-end hardware isn’t always possible. This is especially challenging for VR support. VR experiences have to be rendered at higher framerates to maintain immersion. However, if all the objects in a scene have very high poly counts, then performance could take a hit. Managing this is critical for providing successful metaverse experiences.
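A common way to manage that trade-off is level of detail (LOD): keep several versions of each mesh at different polygon counts and swap them based on camera distance. A sketch of the selection logic (the mesh names and distance thresholds are hypothetical):

```python
def pick_lod(distance_m, lods):
    """Pick the lowest-polygon mesh that still looks good at the given
    camera distance. `lods` maps a maximum distance (meters) to a mesh."""
    for max_dist, mesh in sorted(lods.items()):
        if distance_m <= max_dist:
            return mesh
    return None

# High-poly mesh up close, progressively cheaper meshes farther away.
LODS = {2.0: "chair_high", 10.0: "chair_mid", float("inf"): "chair_low"}
```

Running a check like this per object per frame lets a scene full of digitized objects stay within the polygon budget of lower-end VR hardware.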
As we continue into the wild west of the development of metaverse and web 3.0, there are countless opportunities for ideas to flourish. Whether these ideas ride the wave of disruption that these technologies bring to the table or extend the potential of your business to reach new markets, keeping your business competitive is a must.
For instance, suppose your team wants to build an immersive retail store based on metaverse technology. The project would need a fully 3D VR environment and objects to serve as products. The virtual store would also need digital customer service agents to help users find what they need.
Metaverse use cases don’t stop there. It could be a virtual meeting room where you and your colleagues interact as avatars. Or imagine a designer placing decor and changing the color of walls and furniture right in the VR metaverse. Modern technology makes all of this possible.
The full article was originally published here and is based on MobiDev technology research.