Virtual reality allows an unprecedented amount of detail to be packed into a digital environment. With this, programmers and content developers have been relentlessly pursuing ways to enhance 'presence,' create suitable haptics, and gather information from the user, such as through eye tracking. Many also refer to VR as an empathy machine. Speaking of empathy, a VR-focused company is developing a way for computers to read facial expressions in VR.
Emteq has set its sights on developing a haptic platform, dubbed 'FaceTeq,' that would provide facial data to a computer in real time. As explained in their blog, "FaceTeq is a platform technology that uses novel sensor modalities to detect the minute electrical changes that occur when facial muscles contract." It is a fact that when communicating with another person, we focus heavily on the eye region to decipher their current state of mind.
With virtual reality headsets, our eye region is mostly covered, creating an obstacle to retrieving that valuable information that would enhance 'presence' in VR. Such information can help in aiding artificial intelligence or making realistic virtual avatars. What Emteq has done is develop an array of sensors around the eyes that detects the minute muscle contractions triggered by the electrical signals sent to our facial muscles.
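To get an intuition for how muscle-activity readings might map to an expression, here is a minimal sketch. Everything in it is hypothetical: the sensor channels, amplitude values, and threshold are made up for illustration and do not reflect Emteq's actual sensors, data, or algorithms.

```python
# Hypothetical sketch: inferring an expression from an array of
# muscle-activity sensors around the eyes. All channel names, values,
# and thresholds are illustrative only, not Emteq's actual method.

# Simulated signal amplitudes (arbitrary units) from four sensor sites.
# Higher values mean stronger muscle contraction near that sensor.
sample = {
    "left_brow": 4.0,    # brow region activates when frowning
    "right_brow": 3.8,
    "left_cheek": 22.5,  # cheek region activates when smiling
    "right_cheek": 21.0,
}

def classify_expression(reading, threshold=10.0):
    """Naive rule: whichever muscle group clearly exceeds the threshold wins."""
    brow = (reading["left_brow"] + reading["right_brow"]) / 2
    cheek = (reading["left_cheek"] + reading["right_cheek"]) / 2
    if cheek > threshold and cheek > brow:
        return "smile"
    if brow > threshold and brow > cheek:
        return "frown"
    return "neutral"

print(classify_expression(sample))  # -> smile
```

A real system would of course use many more channels and learned models rather than a single threshold, but the basic idea, mapping localized muscle activity to expressions, is the same.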
Founder and Chief Scientific Officer Charles Nduka believes this would greatly benefit people rehabilitating from facial palsy caused by damaged nerves. People suffering from damaged facial nerves, whether due to a tumor, Bell's palsy, or an accident, tend to find it much harder to socialize or to practice facial movements in front of a mirror during recovery. FaceTeq would allow these individuals to enter a virtual environment and practice their facial movements in a judgment-free space.
There are, of course, other potential benefits of face-capture technology embedded in VR headsets, but for now you can visit their website to learn more about their developments.
Thanks for reading! Let me know your take on FaceTeq with a comment below :)