Can you imagine talking with a person of another nationality, each of you in a different language, thanks to real-time translation? Or building a virtual world with voice commands? These are some of the possibilities that Meta believes will become a reality in the metaverse.
At its own artificial intelligence event, the company led by Mark Zuckerberg presented some of the technologies that will serve as the pillars for the development of its virtual world. The idea is that the metaverse can be entered, built and interacted with easily.
A 'Builder Bot' and advanced voice assistants
The first idea presented this afternoon was the "Builder Bot", a tool with which, according to Meta, we will be able to build worlds in the metaverse using only our voice. Although it is at an early stage of development, the company gave a brief demonstration.
Mark Zuckerberg appeared in a virtual environment, represented by his avatar. The place was completely empty, except for the guide lines marking the ground and the sky. To show how the voice commands work, the executive said "let's go to the beach," and the sea and the sand immediately appeared.
He and a colleague then played at designing their own virtual world with elements such as a table, palm trees and clouds, and even played music, all with voice commands. Mind you, the graphics in this demo looked considerably lower in resolution than what we have seen before.
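To give an idea of the kind of mapping involved, here is a minimal, purely illustrative sketch in Python of how a recognized voice command might be turned into scene elements. It is not Meta's implementation; names like `SCENE_PRESETS` and `build_from_command` are invented for this example, and a real system would interpret intent with a speech recognizer and a language model rather than keyword matching.

```python
# Purely illustrative sketch: mapping a recognized voice command to scene
# elements, roughly the way a Builder Bot-style tool might do it at a very
# high level. None of these names come from Meta; they are invented here.

SCENE_PRESETS = {
    "beach": ["sea", "sand", "sky"],
    "park": ["grass", "trees", "benches"],
}

PLACEABLE_OBJECTS = {"table", "palm trees", "clouds", "music"}


def build_from_command(command: str, scene: list[str]) -> list[str]:
    """Very naive keyword matching; a real system would use speech
    recognition plus a language model to interpret intent and context."""
    text = command.lower()

    # "let's go to the beach" -> load a whole preset environment
    for preset, elements in SCENE_PRESETS.items():
        if preset in text:
            scene = list(elements)

    # "add a table and some palm trees" -> place individual objects
    for obj in PLACEABLE_OBJECTS:
        if obj in text and obj not in scene:
            scene.append(obj)

    return scene


if __name__ == "__main__":
    scene: list[str] = []
    for spoken in ["let's go to the beach", "add a table and some palm trees"]:
        scene = build_from_command(spoken, scene)
        print(spoken, "->", scene)
```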
Builder Bot's ability to understand natural language stems from a larger and more ambitious Meta project called CAIRaoke. It works with an end-to-end neural model that seeks to enable "personal and contextual" conversations.
It is not a brand-new technology: Meta is already using CAIRaoke in its Portal products. In the future, however, the company plans to keep improving it and to integrate it into augmented and virtual reality devices so that the assistants of the future provide "deep interactions".
But what exactly would a natural language request look like? As an example, Meta offers something like this: "Mute all notifications for the rest of the day, unless it's my mom calling." Or something more complex, such as: "Can I rent the local community center for a private party?"
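Requests like these mix an action with conditions and exceptions, which is what makes them hard for today's command-based assistants. As a hedged illustration (Meta has not published CAIRaoke's internal representation, so the field names below are invented), this is the kind of structured interpretation an end-to-end model would ultimately have to produce from that single sentence:

```python
# Illustrative only: one possible structured interpretation of the request
# "Mute all notifications for the rest of the day, unless it's my mom calling."
# The field names are invented for this sketch, not taken from CAIRaoke.

from dataclasses import dataclass, field
from datetime import time


@dataclass
class MuteNotificationsIntent:
    until: time                                           # when the mute ends
    exceptions: list[str] = field(default_factory=list)   # callers let through


def interpret(utterance: str) -> MuteNotificationsIntent:
    """Stand-in for an end-to-end dialog model: here the mapping is
    hard-coded; the point of a model like CAIRaoke is to learn it."""
    assert "mute" in utterance.lower()
    return MuteNotificationsIntent(
        until=time(23, 59),        # "the rest of the day"
        exceptions=["mom"],        # "unless it's my mom calling"
    )


if __name__ == "__main__":
    intent = interpret(
        "Mute all notifications for the rest of the day, unless it's my mom calling."
    )
    print(intent)
```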
Real-time translators
Meta says that the metaverse is the next evolution of social media, where we will connect with people from different parts of the world. Part of the company's approach to its virtual world is to remove language barriers through real-time machine translation.
Along these lines, Meta says it will develop two tools over the long term that will cover "most of the world's languages". One is an advanced AI model that aims to deliver quality translations even for languages with little training data available, from Asturian to Luganda.
The other is a universal speech translation system that will work directly with spoken language. "Imagine, for example, that people in a market who speak different languages can communicate with each other in real time using a phone, a watch or glasses," says Meta.
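Today, spoken translation is usually handled as a cascade: speech recognition, then text translation, then speech synthesis. A hedged sketch of that conventional pipeline helps make clear what a universal speech translator would have to cover; the function names below are placeholders, not a real API, and a direct speech-to-speech system would aim to collapse these stages.

```python
# Illustrative sketch of a conventional cascaded speech-to-speech pipeline
# (ASR -> text translation -> TTS). The functions are placeholders, not a
# real API; each stage adds latency and can introduce errors, which is why
# translating speech directly, across many language pairs, is attractive.

def recognize_speech(audio: bytes, lang: str) -> str:
    """Placeholder ASR step: source-language audio -> source-language text."""
    raise NotImplementedError("plug in a speech recognizer here")


def translate_text(text: str, src: str, tgt: str) -> str:
    """Placeholder machine translation step: source text -> target text."""
    raise NotImplementedError("plug in a translation model here")


def synthesize_speech(text: str, lang: str) -> bytes:
    """Placeholder TTS step: target-language text -> target-language audio."""
    raise NotImplementedError("plug in a speech synthesizer here")


def speak_translated(audio: bytes, src: str, tgt: str) -> bytes:
    """The full cascade from one speaker's audio to the listener's language."""
    text = recognize_speech(audio, src)
    translated = translate_text(text, src, tgt)
    return synthesize_speech(translated, tgt)
```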
The truth is that we will have to wait, in some cases longer than others, for these technologies to reach the metaverse, which is itself still on the way. Meta will have to overcome significant challenges along the way, including developing large-scale AI models.
In Xataka | Meta says it will have the world's most powerful AI supercomputer by mid-2022: 5 exaFLOPS to power the metaverse