The metaverse challenges us to stretch our collective imaginations to the edges of our digital experience.
While building the navigation engine for the metaverse, we at Lighthouse have grown accustomed to thinking about its edge cases, mysteries, and unresolved dilemmas. And we’d love to invite you into some of our late-night musings.
“You get what you measure.”
“What gets measured gets managed.”
“If you don’t measure it, you can’t improve it.”
How do these well-known axioms from the world of business KPIs translate in the metaverse space? In this fourth installment of our six-part series, The Edges of the Metaverse, we explore what’s measurable in the metaverse.
A fully realized immersive metaverse produces more and richer metrics for tracking: biometrics, attention, demographics, experiences.
These metrics also reveal valuable insights about society and the human experience, including a greater ability to explore behavior in hypothetical situations and a better understanding of how communities learn and communicate.
The richness of this data raises justifiable questions around privacy, security, and monetization that the industry needs to address.
A furrowed forehead indicates worry or puzzlement. The rise of a brow: surprise, with more than a hint of skepticism. Biting the lip betrays shyness or anxiety. Pursed lips? Distaste and disapproval.
Every human interaction is imbued with mostly subconscious signaling. It’s our bodies’ “tell” of how we really feel — regardless of whether we intend to communicate that feeling to others or not.
In the physical world, that information is subtle, and not always easy for people to notice or read. You’d have to be looking in the right direction and paying attention at the right time to pick up on all the body language and visual cues. But in a virtual world realized through AR and VR, that information is conveyed as data — including everything from visible reactions (facial responses, for example) to the inconspicuous ones (such as a cortisol spike in response to stress or an adrenaline rush when you feel excitement). That data is then transferred and stored somewhere — available for notice, recording, and analysis by whoever has access to it.
Tracking and analytics technology is already being implemented. Meta’s latest VR headset, the Quest Pro, includes five inward-facing cameras that watch a person’s face to track their eye movements and expression. A smile, a wink, a frown, a raised eyebrow are all observed in real time.
As Meta CEO Mark Zuckerberg said in his announcement, this information is in many ways more revealing than anything we intentionally convey: “When we communicate, all our nonverbal expressions and gestures are often even more important than what we say, and the way we connect virtually needs to reflect that too.”
Although Meta has said it won’t use that information to predict emotions or directly influence user behavior, the value of such data to companies is clear: Meta has already filed patents regarding advertisements that can be personalized in the metaverse based on a number of data points, including facial expressions.
With raw images and pictures stored on headsets, processed locally, and then deleted, Meta claims the insights gathered from that data will mostly focus on personalizing user experience. But can users trust their data to companies that have spotty records of securing it — particularly when working with third-party developers, as Meta did when it accidentally exposed countless users’ data to an analytics firm working on the 2016 U.S. Presidential campaign?
For users to have trust in the metaverse, they will need to feel comfortable with what sensory, emotional, biometric, and psychological information is tracked in it, and that the platforms they interact with are transparent about how they collect and use that data.
Here’s a glimpse at some of the things that can be tracked in the metaverse, and what that means for all of us who want to build and explore it.
In the physical world, you can be fairly sure that a visit to grandma’s house is actually a visit to grandma. But in the spatial web, how can you be sure that the avatar you’re interacting with represents the right person?
As troubling as biometric tracking can be, identity hacking is equally worrisome, and here biometric tracking can actually help defend against the Little Red Riding Hood problem. For instance, collecting fingerprints or voice prints — biometrics that are already commonly used in smartphones to verify your identity — may help users confirm that the avatar they’re talking to represents the right person.
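The verification idea above can be sketched in a few lines. This is a hypothetical illustration, not any platform’s actual mechanism: it assumes a biometric sample (a voice print, say) has already been reduced to a numeric embedding, and simply compares a live sample to the template enrolled for the avatar’s owner. The function names, embeddings, and threshold are all invented.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify_avatar(live_embedding, enrolled_template, threshold=0.85):
    """Return True if the live biometric sample is close enough to the
    template enrolled for this avatar's owner. Biometric matching is
    approximate, so a similarity threshold is used, not exact equality."""
    return cosine_similarity(live_embedding, enrolled_template) >= threshold

# A near-identical sample passes; an unrelated one does not.
owner = [0.9, 0.1, 0.4]
print(verify_avatar([0.88, 0.12, 0.41], owner))  # → True
print(verify_avatar([0.1, 0.9, 0.2], owner))     # → False
```

Real systems would add liveness detection and secure template storage; the point here is only that biometric matching is a similarity test, which is what lets it confirm that grandma’s avatar is actually grandma.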
Tracking the way people interact with each other in a virtual reality similar to our physical world could provide novel opportunities for sociological research, which often runs up against statistical issues due to small sample sizes and geographic constraints.
For instance, it is exceedingly difficult to measure the impact of educational efforts in some rural communities because limited populations can make the study group statistically insignificant.
By using the metaverse to observe the interests and habits of rural students in, say, a virtual classroom setting, experts could more easily connect the data of multiple rural communities across a much larger geographic footprint — building the cumulative sample size they need to be more confident about their conclusions.
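The pooling idea above can be sketched with a toy calculation. All numbers and names below are invented for illustration, not drawn from any real study: each list stands for per-student score gains measured in a virtual classroom, and combining several small community samples narrows the confidence interval around the estimated effect.

```python
import math
import statistics

def ci_half_width(sample, z=1.96):
    """Approximate 95% confidence-interval half-width for the sample mean:
    z * (sample standard deviation) / sqrt(n). Smaller means more precise."""
    return z * statistics.stdev(sample) / math.sqrt(len(sample))

community_a = [3.1, 4.0, 2.8, 3.5, 3.9]       # 5 students in one community
community_b = [3.3, 2.9, 4.2, 3.6, 3.0, 3.8]  # 6 students in another
pooled = community_a + community_b            # 11 students combined

# The pooled estimate is noticeably tighter than either community alone.
print(ci_half_width(community_a), ci_half_width(pooled))
```

The shrinking interval is exactly the “cumulative sample size” benefit described above: each rural classroom is too small on its own, but the combined data supports more confident conclusions.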
Similarly, learning about what metaverse users spend their time doing will create fascinating new avenues of research.
“The amount of attention that people give during immersive experiences is overwhelmingly the most important metric that companies measure,” writes Tom Ffiske, author of the Immersive Wire newsletter.
There is a good reason why businesses see attention metrics as so valuable: the more they know about where users focus their attention, the more effectively they can monetize it by steering users toward their products and services.
However, actually measuring attention is still a challenge, with researchers mostly focusing on time spent in XR (extended reality) experiences, as concluded in a 2022 study produced by Gorilla in the room, a company that applies cognitive science to consumer research and health care.
“XR data analytics haven’t kept pace with creative and technological advancements, so the industry needs scalable metrics which are unique to 3D experiences as opposed to legacy 2D media,” said Jonathan Barrowman, CEO of Gorilla in the room.
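As a toy illustration of the time-spent metric that researchers currently lean on, here is one hedged sketch of turning a stream of gaze samples into per-object “dwell time”. The data format, object names, and attribution rule (each interval is credited to the earlier sample’s target) are all assumptions made for this example, not an industry standard.

```python
from collections import defaultdict

def dwell_times(gaze_samples):
    """Sum the time the user's gaze rests on each object.

    gaze_samples is a list of (timestamp_seconds, object_id) tuples in
    chronological order; the interval between consecutive samples is
    attributed to the earlier sample's target."""
    totals = defaultdict(float)
    for (t0, obj), (t1, _) in zip(gaze_samples, gaze_samples[1:]):
        totals[obj] += t1 - t0
    return dict(totals)

samples = [
    (0.0, "billboard"),
    (1.5, "avatar_42"),
    (4.0, "billboard"),
    (4.5, "billboard"),
    (6.0, "avatar_42"),
]
print(dwell_times(samples))  # → {'billboard': 3.5, 'avatar_42': 2.5}
```

Even this crude aggregate hints at why the data is commercially sensitive: it reveals, object by object, what held a user’s attention and for how long.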
It’s easy to foresee how experiential data could be used to glean new insights about physical world behavior. However, the metaverse could also open the door to new avenues for exploration, including enabling a deeper analysis of what humans might do in hypothetical situations.
This matters because, as most of us might grudgingly admit, what we say we would do in theory often doesn’t match what we would actually do in practice.
Consider the classic “trolley problem,” a thought experiment in which a person must choose either to do nothing and let five people die, or to intervene by diverting the trolley onto a track where it will kill one person.
When asked in various surveys, 90% of people said they would choose to intervene, leading to the death of one person but saving the other five. In 2017, however, researchers conducted a realistic study of that thought experiment: participants were placed in a train-switching station and shown footage they believed was real, recreating the predicament. Given the same choice, including the option to pull the lever and divert the train toward a secondary track with just one person on it, most of the participants did not pull the lever.
The circumstances don’t have to be quite so grim, of course. But by tracking the types of activities and experiences that users have in the metaverse, we may be able to glean better information about human behavior, desires, and hopes.
For example, what would people do if they could suddenly be anywhere in the world at a moment’s notice? What if they could fly, or be invisible? Or, perhaps, an even more important question: given the power to do or be anything, what would people choose — and would it actually make them happier?
Tracking the demographics of users within the metaverse provides valuable information about who uses virtual worlds and how. This includes age, gender, geographic location, and other factors.
There are major concerns about how this data will be used, of course. Algorithms across industries, from hiring tools to crime-prevention software, struggle to avoid bias when relying on data sets. Relying too heavily on demographic data to study the metaverse could perpetuate discrimination, despite efforts to the contrary.
For that reason, metaverse operators may want to focus more on demographic data as a tool for observation and understanding, rather than as a data set driving algorithmic decision-making.
Some demographic data will be absolutely essential to understanding how various communities experience the metaverse. This is also the case in the physical world, where, without such data, we might not know about the disproportionate number of people of color shot by police, for example.
By carefully managing decisions around privacy and anonymity, metaverse builders can help produce data that supports communities that have been marginalized or little understood, enabling an experience that is empowering rather than inhibiting.