Mark Zuckerberg is building the tools he needs to power his metaverse, and human-level artificial intelligence is his primary goal. Experts have questioned whether today's technology can process the massive amounts of data the metaverse needs to run. At last month's Meta event, "Inside the Lab: Building the metaverse with AI," Zuckerberg said artificial intelligence is how the company plans to pull off that big-data trick.

But it's not just AI that Meta is building. The company recently announced a new AI supercomputer, the Research SuperCluster (RSC), which it says will be the fastest AI computer in the world. Smart glasses, VR and AR headsets and their chips, 5G, and supercomputers are the tools Meta will use. But the stars of the show, the great digital illusionists of the metaverse, will be human-level AI systems.

Related: AI Researcher Says Big-Tech Is Doing It All Wrong

Tech Talks reported that Meta's Chief AI Scientist, Yann LeCun, believes human-level AI can be created using an approach called self-supervised learning. The concept is linked to how humans develop intelligence: while still hypothetical, LeCun is exploring how computers can learn from observation. This isn't Meta's first foray into self-supervised learning, either. The company has already built an AI system that can digitize real objects in Horizon Worlds, Meta's entry point into creating the metaverse. It now looks to go even further with this nurture-versus-nature model.
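Neither Meta nor LeCun has published the code behind this effort, but the core idea of self-supervised learning can be sketched in a few lines: the training signal comes from the data itself rather than from human labels. The toy example below is a hypothetical illustration, not Meta's system; it hides one value in a sequence and trains a model to predict it from the surrounding context.

```python
# A minimal sketch of self-supervised learning (not Meta's actual system):
# the "label" is a hidden piece of the input itself, so no human annotation
# is needed. A tiny linear model learns to predict the middle value of a
# 3-number window from its two neighbors.
import numpy as np

rng = np.random.default_rng(0)

# Unlabeled "observations": smooth random walks, like sensor or video signals.
data = np.cumsum(rng.normal(size=(1000, 32)), axis=1)

# Pretext task: mask the center of each 3-step window and predict it.
left = data[:, :-2].reshape(-1, 1)
right = data[:, 2:].reshape(-1, 1)
target = data[:, 1:-1].reshape(-1)     # the masked value is the training signal
features = np.hstack([left, right])

# Fit weights by least squares -- the simplest possible "model".
weights, *_ = np.linalg.lstsq(features, target, rcond=None)

# Purely from raw data, the model discovers that the middle point is roughly
# the average of its neighbors.
print("learned weights:", weights)     # ~[0.5, 0.5]
```

Real systems swap the linear model for a large neural network and the toy sequences for video, images, or text, but the principle, predicting hidden parts of the data from visible parts, is the same.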

Human-Level AI: Nurture Versus Nature

Illustration via Meta

For humans, learning begins at a very young age through both nurture and nature. Nurture learning comes from parenting and early education, while nature learning starts when a child builds a model of the world by observing it. In AI, nurture would be the human input: the coding and programming. Nature learning, by contrast, would require an AI independent enough to observe the world and learn by itself.

The "world model" concept is not new, but LeCun believes this approach is the key to reaching human-level AI. So, how are world models created? Humans develop a "world model" during the first few months of their lives. For example, they learn about gravity when they see how a ball or one of their toys falls. Then, they slowly create a virtual inner world model with laws of nature, in this case, gravity. This world model will tell them that any object will fall just like the ball, or the toy, without needing them to drop it.

Humans develop common sense, abstract thinking, and other complex social concepts with this world model. Meta says it is breaking new ground with this self-supervised learning approach. The company has already created an AI concept, BuilderBot, that would generate or import objects into the metaverse through voice commands. It is also working on AI avatars that speak any language and will guide, build, and interact with every user connected to the metaverse.
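Meta has not published how BuilderBot works, so the snippet below is only a hypothetical sketch of the general voice-to-scene pattern such a tool implies: a transcribed command is parsed into an intent, and the corresponding object is added to a scene description. Every name in it is invented for illustration.

```python
# Hypothetical sketch of a BuilderBot-style voice-to-scene pipeline.
# Speech recognition is assumed to have already produced the text transcript.
import re

SCENE: list[dict] = []  # a stand-in for the shared world state

def handle_command(transcript: str) -> None:
    """Map a transcribed voice command like 'add a palm tree' to a scene object."""
    match = re.match(r"add (?:a|an|some)?\s*(.+)", transcript.strip().lower())
    if not match:
        print(f"ignored: {transcript!r}")
        return
    obj = {"type": match.group(1), "position": (0.0, 0.0, 0.0)}
    SCENE.append(obj)
    print(f"spawned {obj['type']} -> scene now has {len(SCENE)} objects")

handle_command("Add a palm tree")
handle_command("add some clouds")
```

In a production system, the regex would be replaced by a language model that understands free-form speech, and the scene list by the metaverse's actual world state, which is exactly where the self-supervised models described above would come in.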

Next: Who Is QAnon? AI Might Have Just Figured It Out

Source: Tech Talks, Meta News