Building a Mirrorworld vs. a Metaverse
While there has been significant hype around the Metaverse, an alternative is being built that stands to transform the way humans live over the next two decades. The Metaverse is a hypothesized iteration of the Internet, supporting persistent online 3D virtual environments through conventional personal computing as well as virtual- and augmented-reality headsets. Metaverses, in some limited form, have already been implemented in video games such as Second Life. The Mirrorworld, by contrast, is a 1-to-1 map of almost unimaginable scope: a representation of the real world in digital form, one that attempts to map real-world structures in a geographically accurate way. When it is complete, our physical reality will merge with the digital universe.
Inside the mirrorworld, agents like Siri and Alexa will take on 3D forms that can see and be seen. Their eyes will be the embedded billion eyes of the matrix. They will be able not just to hear our voices but also, by watching our avatars, to see our gestures and pick up on our microexpressions and moods. Their spatial forms—faces, limbs—will also increase the nuances of their interactions with us. The mirrorworld will be the badly needed interface where we meet AIs, which otherwise are abstract spirits in the cloud.
For the mirrorworld to come fully online, we don't just need everything to have a digital twin; we need to build a 3D model of physical reality in which to place those twins. Consumers will largely do this themselves: when someone gazes at a scene through a device, particularly wearable glasses, tiny embedded cameras looking out will map what they see. The cameras capture only sheets of pixels, which don't mean much on their own. But artificial intelligence, embedded in the device, in the cloud, or both, will make sense of those pixels; it will pinpoint where you are in a place at the very same time that it is assessing what is in that place. The technical term for this is SLAM (simultaneous localization and mapping), and it is happening now: Niantic's 6D.ai, for example, is developing AR apps that can discern large objects in real time.
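The core trick of SLAM is that the device estimates its own position and the positions of things in the scene at the same time, each estimate refining the other. The sketch below is a deliberately toy, one-dimensional version of that idea: a linear Kalman filter jointly tracks a "robot" pose and a single landmark from noisy odometry and noisy range readings. All the numbers, the single-landmark setup, and the 1D world are invented for illustration; real AR SLAM works on camera images in 3D and is far more elaborate.

```python
# Toy 1-D SLAM sketch: jointly estimate the device's position and one
# landmark's position from noisy odometry and noisy range measurements.
# Illustrative only -- not how a production AR stack is built.
import numpy as np

rng = np.random.default_rng(0)

TRUE_LANDMARK = 10.0   # ground-truth landmark position (unknown to the filter)
MOTION_NOISE = 0.1     # std dev of odometry error per step
RANGE_NOISE = 0.5      # std dev of range-sensor error

# State: [device position, landmark position]. The device starts at 0
# (known exactly); the landmark is completely unknown (huge variance).
x = np.array([0.0, 0.0])
P = np.diag([0.0, 1e6])

F = np.eye(2)                          # positions persist between steps
H = np.array([[-1.0, 1.0]])            # measured range = landmark - device
Q = np.diag([MOTION_NOISE**2, 0.0])    # only the device moves
R = np.array([[RANGE_NOISE**2]])

true_device = 0.0
for step in range(50):
    # --- motion: command "move +1", executed with noise ---
    u = 1.0
    true_device += u + rng.normal(0, MOTION_NOISE)
    x = x + np.array([u, 0.0])         # predict
    P = F @ P @ F.T + Q

    # --- measurement: noisy range to the landmark ---
    z = (TRUE_LANDMARK - true_device) + rng.normal(0, RANGE_NOISE)
    y = z - (H @ x)                    # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P

print(f"estimated device pose:   {x[0]:.2f} (true {true_device:.2f})")
print(f"estimated landmark pose: {x[1]:.2f} (true {TRUE_LANDMARK:.2f})")
```

Note the key SLAM property: each range reading only constrains the *relative* positions, yet because the filter carries the correlation between the two state entries, localization and mapping improve together rather than one waiting on the other.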
Everything connected to the internet will be connected to the mirrorworld. And anything connected to the mirrorworld will see and be seen by everything else in this interconnected environment. Watches will detect chairs; chairs will detect spreadsheets; glasses will detect watches, even under a sleeve; tablets will see the inside of a turbine; turbines will see workers around them.
This is the next major paradigm shift, one that will unfold over the next two decades. It layers the digital world that exists today, including the Internet of Things, 3D models, SLAM, and digital mapping, onto our physical world. Read more in this Wired article.