Zone’s Technical Director, Mika Tasich, takes a deep dive into the definition of the Metaverse and what it means for the future.
This past year was dominated by the notion of the Metaverse — a magical, transformative technology that will change the world. To put all my cards on the table at the outset: I love the Metaverse. I love space travel, teleports, and wormholes too. I am generally a sucker for a good sci-fi concept, and the closer to reality it is, the more excited I get.
I also have to admit I’ve been around for a while now. I’ve ridden this Spaceship Earth fifty-odd times around the Sun, and Metaverse was a very familiar concept to me for more than half of that time. I knew it before it even had a name. If I remember correctly, I first came across it in Neuromancer before I started growing facial hair. I wore my first VR headset and coded my first VR “experience” years before I enjoyed Snow Crash and was introduced to the actual term ‘Metaverse’. I even tried to build a Metaverse of my own before the first tech bubble burst.
I am also an engineer — the kind that builds digital stuff and, stereotypically, likes things to be well-defined, neatly explained, and universally understood.
Frustratingly, Metaverse is anything but.
I am not trying to claim that I am the expert here. Nor am I trying to say that I know what the Metaverse is and that my vision is right and everyone else’s is wrong. I am just trying to convey the fact that I’ve been thinking about it longer and deeper than most people, and this is the insight I would like to share.
I am also aware of my biases. Well, some of them, at least. I am deeply disappointed — you could even say bitter — at the state of the Metaverse today, especially the current vision served to us by Meta. You know the disappointment you get when a massive Hollywood studio takes your favourite book and butchers it? So instead of ranting about it, I approached this more constructively.
Define the Metaverse.
But how do you define an abstract, nebulous concept devised as a plot device in a sci-fi novel? Being a techie, I steal approaches that work, and nothing beats the axiomatic system when it comes to putting a solid definition on an abstract concept. It has been used to define the most abstract of human inventions — mathematics — for thousands of years, so it should be more than adequate for clarifying what the Metaverse is.
Before we dive into the axioms, just a quick reminder:
💡 The axiomatic system defines notions through a small set of unquestionable rules (axioms). The rules themselves are taken as given, but everything else has to follow from the logic they impose. For example, two and a half thousand years ago, Euclid wrote five simple statements even a child could comprehend. Follow the implications of those five deceptively simple statements and you get the whole field of geometry, and those axioms stand firm even today.
Using this method to try and nail the concept of the Metaverse, I assembled my axioms defining what the Metaverse means for me. For me, the Metaverse is:
- A technology — It is not an abstract concept like, say, mindfulness. It is a tangible piece of enabling technology, such as aeroplanes, binoculars, or spreadsheets.
- Social, with presence — It enables real-time communication between one or many individuals, where each individual has a manifest presence.
- Sensory — Human–computer interaction (HCI) with it is sensory and direct.
- Irreplaceable — It provides solutions which are impossible or very impractical to achieve without it.
- Placeless — Its usefulness is universal and it is not determined by location. For example, it is not a tool that is only useful at home and not in the office.
Armed with these, I set out to find the most Metaverse experience I could. Something I could point people to and say, “This! This is Metaverse.”
I started by jotting down all the usual suspects in a table:
And then I stopped. It quickly became apparent that none of these, nor any of the rest I planned to examine, tick the two most important boxes. None offer anything I cannot do better in the real world or with some other technology or solution.
So, I cast my net wider. I went looking for something that matched my axioms but might not be labelled “Metaverse”. And I found it. I know that the people who made it have not used the M word anywhere. I am also pretty confident you will think I am crazy when you see their promo video, but give me ’til the end of this blog to change your mind. I hope to persuade you not only that this application represents the Metaverse but that it offers a glimpse into the future we will all share.
Here it is. The most Metaverse application out there is Nanome. If you don’t know it, it may be worth spending a minute and a half watching their promo video.
To understand why I chose Nanome, I’ll start with the two axioms that elude everyone else and where Nanome excels: why is it irreplaceable and placeless? In keeping with the mathematical theme, I will show you my working first, then my proof.
The Interface problem
We are so used to seeing flat 2D images that we don’t think about it too much because it works well for all of our needs. However, this is the thing with transformative technologies — they change our needs and our expectations.
Before fire, we did not need cooked food. Then we invented the pot, externalised our stomachs, and now we cannot live without it. Before smartphones, we did not need to reach anyone on the planet without knowing where they were, or to be able to extract any fact regardless of where we were. But now? We have externalised our minds into these handheld wonders, and we cannot live without them. Remove the smartphone, and our society will implode.
So, here we are. Well fed on cooked food, likely reading this on the 2D screen of your smartphone, and you are probably wondering what I am talking about.
Look at this image, and see if anything strange strikes you.
Not much, right? Just a cowboy riding into the sunset. What is strange is that we can see this image at all. If you were to measure it, you’d find very little information in this image compared to the scene it represents, and almost none of it is accurate. Your spectrograph will tell you that all the colours are wrong. Your light meter will tell you that the light’s intensity is a tiny fraction of what was there on the day. I will not even mention sizes, distances, smells, humidity, or motion. Yet, somehow, it is enough for us to deduce all of these and more. Our brains are endlessly fascinating, but let’s focus on just one aspect of this photo: the sense of depth it depicts.
You know that the trees on the horizon are far away. You understand that the clouds and the Sun are even further. Even if you have never seen a cowboy or a prairie, you have enough personal and evolutionary experience to deduce it effortlessly. But what if you don’t? What if you want to see something that none of your evolutionary ancestors had to deal with? How about a protein called a porin? Here is a picture of one:
As we have all learned from our recent two-year planet-wide quarantine, shape really matters for proteins. The image above has perspective, just like the cowboy photo, but good luck trying to figure out the porin’s shape from it. Would having several angles help? You would think so. Here are some:
Super helpful, right?
Luckily, there is a trick you can use on a 2D screen to see the shape and really understand it, but it is a hack for your mind, more an optical illusion than a tool. As soon as you introduce some movement, your brain will let you see the shape clearly. Stop moving, however, and it’s gone in the literal blink of an eye. See for yourself; it is quite something.
If you imagine that you care about the position of every single atom in this molecule, then you can understand the problem. But what if you had a model of the molecule there on your desk instead of seeing an image of it on your screen? None of this would be a problem, regardless of the shape complexity.
The issue is an inherent limitation of 3D-to-2D conversion. The shape’s complexity and our lack of personal or evolutionary experience only make the limitation apparent; they aren’t causing it. You can see it anywhere, not just in porins. It would be there even in a photo of a cowboy on a prairie, taken from the right angle.
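The information loss is easy to demonstrate in a few lines of code. The sketch below is a minimal illustration with made-up points (nothing from Nanome): under a simple pinhole projection, two points at different depths land on exactly the same image coordinates, and only "movement", here a small rotation about an axis through the scene, separates them again.

```python
import math

def project(point, focal=1.0):
    """Pinhole projection: depth is discarded by dividing by z."""
    x, y, z = point
    return (focal * x / z, focal * y / z)

# Two hypothetical points at very different depths along the same line of sight...
near = (1.0, 0.5, 2.0)
far  = (2.0, 1.0, 4.0)

# ...land on identical 2D image coordinates: the depth information is simply gone.
assert project(near) == project(far)  # both give (0.5, 0.25)

def rotate_y(point, angle, center=(0.0, 0.0, 3.0)):
    """Rotate a point about a vertical axis through `center`,
    like spinning a molecule on screen."""
    x, y, z = point
    cx, _, cz = center
    c, s = math.cos(angle), math.sin(angle)
    x, z = x - cx, z - cz
    return (c * x + s * z + cx, y, -s * x + c * z + cz)

# A small spin separates the two projections again. The way the image
# changes over time is the depth cue the "movement" trick exploits.
p_near = project(rotate_y(near, 0.1))
p_far  = project(rotate_y(far, 0.1))
assert p_near != p_far
```

Note that the rotation axis deliberately passes through the scene rather than the camera: spinning the camera in place would keep both points on the same ray and teach you nothing, which is why we rotate the object, just as the molecule viewers do.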
The Interaction Problem
Almost every interaction we have with a digital world is a “metaphor”. For example, we swipe our fingers to “scroll” this page up. You will use the trackpad on your laptop to move an arrow on your screen over a “button”. You will move your joysticks and press key combos to make your game character wield their weapons. And as with 2D screens, these metaphors work well for us most of the time, but metaphors can only take you so far.
Take a simple game like a shape sorter, for example. Even a parrot can do it in real life:
However, you are in trouble if you want to perform a similar task on a 2D screen. This task requires specialist software, and even with it, it is not something you can do without any training. No metaphor can help us here.
I do not particularly want to compete with an Eclectus parrot, and I am by no means an expert Blender user, but over the years I have spent many hundreds of hours in Blender and similar environments. Here is me trying (unsuccessfully) to realign three simple shapes in Blender.
If you think you can do better, download Blender and then this file and try it for yourself.
Now, equipped with this knowledge, imagine that your job is not only to understand, design and control these complex molecules and their shapes, but that lives quite literally depend on your doing so.
Would you fight the unavoidable interface and interaction challenges of 2D screens and input devices, or eliminate them? Reduce them to nothing? That is what Nanome has done. As a result, you can see how little effort it takes users to perform tasks that would otherwise take time, skill and training. It is so effortless and intuitive that even a layman like me can follow a conversation explaining mutations in the COVID-19 spike protein.
Have a look if you are interested, and excuse the constantly moving camera, but, as you have learned, that is the only way you can perceive the shape they are talking about.
Because it is irreplaceable, Nanome is useful regardless of your location. So, for example, it is as valuable for people sharing an office as it is for people collaborating from their bedrooms on opposite sides of the planet.
It is a technology, has sensory HCI, and is social with presence; therefore, by definition, this is a Metaverse app.
The glimpse into the future
I am going to go out on a limb here and assume that you, dear reader, are thinking something along the lines of “Even if you are right, and Nanome is metaversey, what does this exotic science tool have to do with me?”
If you are a cowboy on a prairie living off the grid, not very much, I guess. But if you are participating in the information age, it may show you your future in more ways than one.
Firstly, the sensory HCI. I know that many people will rightly scream, ‘VR is not the Metaverse’, and I would agree. It isn’t, which is why I didn’t write my axiom that way. But some kind of sensory HCI is crucial. VR/AR can be one facet of it; the crazy sense-expanding contraptions from the mind of David Eagleman could be another. It is sensory HCI that enables the Metaverse to be irreplaceable and placeless, and that will allow the technology to become an intuitive, intrinsic part of our lives. VR is just the first step on that journey, a signpost showing us what is possible when HCI becomes immersive.
Secondly, the sheer coolness and sense of adventure some people (like me) get when exploring new technology are not what will pull the Metaverse into the mainstream; utility will. We will use Metaverse technologies because they allow us to do things we cannot achieve without them. That is why we are seeing the killer Metaverse apps appear in industry first.
Thirdly, we need to get over the inevitable hump of skeuomorphism. Remember page curls on PDFs and cassette-style audio player apps? With every technology cycle, it takes us a while to separate what is genuinely helpful from what is just a nostalgic limitation imposed by the current state of technology. In VR, this trap is even wider and the bait even tastier because the experience is immersive, but it will always be easier to fill in a Post-it® note in real life than in any virtual environment. Nanome has it easy here, as there is no way to replicate quantum weirdness skeuomorphically. Finding novel concepts and an interaction grammar for tasks we have yet to imagine, however, will require a lot of experimentation and ingenuity. I have no idea what they will be, of course, but I know they will be simultaneously abstract and intuitive, just like Nanome.
It is also especially pleasing that a tool for designing molecular machinery in virtual reality could be an excellent plot device for a sci-fi novel. How can you get any more Metaverse than that?