Facial expressions can now be detected in the metaverse

A University of South Australia study has investigated the use of facial expressions as a mode of interaction within the virtual metaverse environment.

Virtual reality (VR) is a computer-generated, digitally simulated medium that provides an immersive and interactive metaverse experience. Traditionally, interacting with VR environments has required hand-held controllers. However, recent technological advances have shifted towards alternative, hands-free methods. While multiple hands-free techniques have been developed, including eye-tracking, head movement and body-based movements, an international research team led by Dr. Arindam Dey from the University of South Australia sought to investigate, for the first time, another alternative: facial expressions.

The primary rationale behind the study, as Dr. Dey describes, “was to make the metaverse more accessible and inclusive,” as most VR headsets require users to be “at least partially physically able in order to interact in VR.” 

In the study, facial expressions were utilised as the mode of interaction with the virtual environment, serving both as spatial navigation commands and as triggers for action-based tasks. The performance of facial expression participants was evaluated against that of participants using hand-held controllers, and both groups were assessed on objective, physiological measures as well as subjective, qualitative variables.

The researchers investigated a central question: can facial expressions be used as a mode of interaction within the virtual metaverse environment?

To evaluate facial expression use, the researchers adopted an experimental design: controller users were assigned to a control group and facial expression users to an experimental group.

Once participants were randomly assigned to their respective groups, each was fitted with a VR headset and a wireless electroencephalogram (EEG), capturing facial expressions and recording neurological activity from the frontal and parietal lobes of the brain.

The researchers also recorded electrodermal data, the electrical signal produced by sweat secretion in the skin, along with heart rate; both provide direct, proportionate measures of physiological response.

The researchers then subjected both the control and experimental groups to three 4-minute-long conditions: happy, neutral and scary. In the happy condition, participants were instructed to walk through a virtual park and catch butterflies. The neutral condition consisted of participants walking through a virtual workshop and picking up various items. And in the scary condition, participants were asked to walk in an underground base and shoot zombies. 

The control group used physical controllers both to navigate and to perform actions, whereas the facial expression group used a smile to initiate movement, a frown to end movement and a jaw clench to trigger an action. An action, in the context of the study, refers to catching butterflies in the happy condition, picking up items in the neutral setting and shooting zombies in the scary environment.
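To make that mapping concrete, the sketch below shows how such an expression-to-command scheme might be wired up in code. It is a minimal illustration only, assuming a hypothetical detector that labels each frame as "smile", "frown", "clench" or "neutral"; the study's own software, detection thresholds and labels are not described in this article.

```python
# Minimal sketch of an expression-to-command mapping (illustrative only;
# the detector, class labels and state handling are assumptions, not the
# study's actual implementation).

def update_state(expression: str, moving: bool) -> tuple[bool, bool]:
    """Map a detected facial expression to VR navigation/action state.

    Returns (moving, perform_action):
      - smile starts forward movement
      - frown stops movement
      - jaw clench triggers an action (catch, pick up or shoot)
    """
    perform_action = False
    if expression == "smile":
        moving = True
    elif expression == "frown":
        moving = False
    elif expression == "clench":
        perform_action = True
    return moving, perform_action


# Example: a short stream of detected expressions driving the avatar.
if __name__ == "__main__":
    moving = False
    for frame_label in ["neutral", "smile", "neutral", "clench", "frown"]:
        moving, act = update_state(frame_label, moving)
        print(f"{frame_label:>7}: moving={moving} action={act}")
```

Keeping movement as a persistent state toggled by smile and frown, rather than requiring a held expression, mirrors how the article describes the scheme: one expression starts navigation, another ends it, and the jaw clench acts as a discrete trigger.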

By subjecting both groups to the three conditions, the researchers were able to test the subjective, qualitative variables of presence, emotions and usability, as reported by participants throughout the study. Presence refers to the sense of actually being in the virtual environment, or total immersion; usability refers to the ease of system use; and emotions cover the measured emotional states. Using these three subjective criteria and the two physiological measures, the researchers found three notable results.

The first notable result was a neurological finding: higher gamma-wave activity was recorded in facial expression participants, suggesting that interacting through facial muscles is associated with a higher cognitive load.

The second result showed that usability, or ease of system use, scored lower for facial expression users than for controller users; with additional practice and training, ease of use could be expected to improve for facial expression users.

The third result showed that controller users recorded higher skin conductance, while facial expression users reported a stronger sense of immersion, or presence. There was no statistically significant difference in emotional arousal or dominance between the two groups, suggesting that facial expression users carried out tasks without any additional emotional burden while also feeling fully immersed in the virtual environment.

The implications of the research are far-reaching. Using facial expressions alone gives amputees and people with physical disabilities the opportunity to engage and interact within the VR world without any additional emotional or physical burden.

The study also pushes AI recognition beyond facial muscle movements alone: detecting users' emotional states opens a new research paradigm, a field that has yet to be fully developed.

The study also opens up the possibility for disabled populations to use digital devices as everyday lifestyle aids and as a means of interacting with the non-virtual world.

The study highlighted the rapid developmental pace of both VR technologies and metaverse applications more broadly. Because of that pace and the broadening scope of the metaverse, interest in joining and investing in the virtual realm is growing, with investors buying and selling virtual land, NFTs and even cryptocurrency. Despite its prominence and accessibility, actually investing in the metaverse can be difficult, and newcomers need detailed guides on how to invest in the metaverse as well as on effective investment techniques.

Rapid advancements within the metaverse landscape have not only changed our perception of reality, but they have also transformed our tangible reality into an immersive, intangible, virtual world. And by using a smile, a frown, or even a jaw-clench, it becomes possible to interact with a virtual world.

Zaid Elayyan is a Neuroscience and Pre-Medical Studies graduate of the University of Tennessee, Knoxville, concentrating in Cognitive Behavioral Neuroscience, Evolutionary Neuroscience and Science Writing. Throughout his academic career, Zaid served as a Research Assistant within the Neuroscience Department, performed course-based Evolutionary Neuroscience research for the Ecology and Evolutionary Biology Department, shadowed Emergency Department physicians at the University of Tennessee Medical Center, and was involved in a multitude of on-campus Neuroscience societies as a science writer. He is a CITI certified Social and Behavioral Neuroscience researcher and is accredited with Excellent Leadership by the National Society of Leadership and Success. Zaid is an active Science Writer with Breakthrough, and is in pursuit of his Master's Degree.