Surveillance systems often use biometrics to identify individuals and to infer characteristics about them. A previous blog post discussed the use of gait recognition, behavioral biometrics, and even cardiac signatures for this purpose. But without doubt, the main technique here is facial recognition, which has been discussed many times on this blog. The problems with this approach are now well understood, and there are growing calls to regulate or even ban facial recognition. One aspect that troubles many people is the use of AI to detect “micro-expressions”. The draft AI Regulation published by the European Commission describes AI systems used by law enforcement to detect the emotional state of a person as “high risk”, meaning their deployment will require stringent controls. Micro-expressions are not the only way that biometric systems seek to analyze what is happening inside people’s heads. Another system is VibraImage from Elsys Corp, based in Russia. Here’s how it works, according to the company:
VibraImage technology measures the micromovement (micro motion, vibration) of a person by capturing pictures from standard digital, web or television cameras and processing the images. Human head microvibrations are linked with the vestibular-emotional reflex (VER) and depend on emotional status. The VibraImage system detects human emotions (by measuring 3D head-neck movements using several frames of video). VibraImage is a system that detects all human emotions!
VibraImage is a new type of image, as primary as an original color image, a thermal image, or an X-ray image. Each type of image gives new and unique information about the object. Every pixel of a VibraImage indicates vibration parameters: the frequency or amplitude of vibration. A single-frame image of vibration frequency and amplitude is called an external VibraImage and looks like a human aura. External VibraImage (vibra-aura) colors are determined by vibration frequency, and size is determined by the amplitude of the vibrations.
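Elsys does not publish its exact algorithm, but the general idea described above (treating each pixel’s brightness over time as a vibration signal and mapping its amplitude and dominant frequency) can be sketched in a few lines. The function below is a hypothetical reconstruction for illustration only; the function name, the frame-differencing choice, and the FFT-based frequency estimate are all assumptions, not the company’s actual method.

```python
import numpy as np

def vibraimage_sketch(frames, fps):
    """Estimate per-pixel vibration amplitude and dominant frequency
    from a stack of grayscale frames (shape: T x H x W).

    Hypothetical sketch of the idea behind an 'external VibraImage':
    amplitude would drive pixel size, dominant frequency pixel color.
    """
    frames = np.asarray(frames, dtype=np.float64)
    # Inter-frame differences capture micro-movement at each pixel.
    diffs = np.diff(frames, axis=0)
    # Amplitude map: mean absolute inter-frame change per pixel.
    amplitude = np.mean(np.abs(diffs), axis=0)
    # Frequency map: dominant non-DC frequency of each pixel's
    # brightness signal, via an FFT along the time axis.
    spectrum = np.abs(np.fft.rfft(frames - frames.mean(axis=0), axis=0))
    freqs = np.fft.rfftfreq(frames.shape[0], d=1.0 / fps)
    dominant = freqs[np.argmax(spectrum[1:], axis=0) + 1]  # skip DC bin
    return amplitude, dominant
```

Even this toy version makes the key point clear: the measurement itself is simple signal processing; the contested step is the leap from these vibration maps to claims about emotions.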
Although hardly a household name in the West, VibraImage technology has been around since 2001 and is used in “thousands of systems around the world”, according to the company. Elsys Corp says that since 2007, VibraImage has successfully detected “criminals, terrorists and suspicious behavior” at two Russian airports. An article about VibraImage on The Conversation by James Wright, a Research Associate at the UK’s Alan Turing Institute, adds that VibraImage has been deployed in several high-profile contexts, including two Olympic Games, a FIFA World Cup and a G7 summit.
In Japan, clients of such systems include one of the world’s leading facial recognition providers (NEC), one of the largest security services companies (ALSOK), as well as Fujitsu and Toshiba. In South Korea, among other uses it is being developed as a contactless lie detection system for use in police interrogations. In China, it has already been officially certified for police use to identify suspicious individuals at airports, border crossings and elsewhere.
Wright notes that supporters of the technology claim that VibraImage can be used to determine personality type, for example by identifying adolescents more likely to commit crimes, categorizing intelligence types, and even establishing loyalty to a company or nation, or the lack of it. However, as Wright says:
many claims made about its effects seem unprovable. Very few scientific articles on VibraImage have been published in academic journals with rigorous peer review processes – and many are written by those with an interest in the success of the technology. This research often relies on experiments that already assume VibraImage is effective. How exactly certain head movements are linked to specific emotional-mental states is not explained. One study from Kagawa University of Japan found almost no correlation between the results of a VibraImage assessment and those of existing psychological tests.
The person behind VibraImage, and founder of Elsys Corp, Viktor Minkin, has written a response to Wright’s criticisms. He emphasizes that VibraImage is not an AI technology, but that it is based on “understandable physics & cybernetics & physiology principles and transparent equations for emotions calculations”. That’s an issue that is best settled by experts in these fields. What is more relevant for this blog are the general privacy issues that VibraImage raises.
One is that it can be used without subjects being aware of it. This means that it is yet another system that can be deployed for covert surveillance. That’s particularly problematic given the claim that VibraImage can give an “emotional analysis” within a minute, which calculates a person’s levels of “neuroticism, inhibition, self-regulation, energy, charming, mental balance, suspect, stress, anxiety, and activity”. It’s easy to imagine situations in sensitive locations such as airports where the authorities might feel the need to act swiftly based on readings that suggest someone in a crowd is suspicious, stressed and anxious. Moreover, even in the best-case scenario of someone being peacefully arrested as a result, they are likely to become stressed and anxious purely by virtue of the arrest. In this sense, the technology’s analysis is self-fulfilling.
It’s also hard to see how people can rebut VibraImage’s claims about their emotions, since there are no comparable technologies they can turn to for an alternative reading. Until much more research has been done into the reliability of approaches that use small-scale biometric readings to make major claims about a person’s most private interior states, this kind of technology definitely needs to remain in the “high risk” category.
Featured image by auntmasako.