Infrared depth sensors in iPhones map users' emotions
Computing & Tech
A new software development kit lets iPhone apps use the front-facing depth camera to detect facial expressions and emotions.
At Springwise, we have seen a number of products that use facial recognition algorithms, from tracking the health of fish to immigration screening at airports. Observant, the company behind bug-reporting product Buglife, has now found a way to use the infrared depth sensors found on some iPhones to analyse facial expressions. The company has designed a software development kit (SDK) that works in any iOS app, helping companies and marketers capture user reactions to a product or a piece of content.
The SDK currently works on the iPhone X, XS, and XR. It offers two modes of operation: streaming emotional analysis data in real time, or taking emotion snapshots triggered by specific in-app events. It uses the phones’ infrared depth sensors to map the user’s face in high detail in almost any type of lighting, and the company’s deep learning algorithms translate that facial data into emotions in real time. Observant argues that this approach captures subtle micro-expressions that webcams and eye-tracking typically miss.
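Observant has not published its SDK’s interface, but the underlying mechanism can be illustrated with Apple’s public ARKit framework, which reads the same TrueDepth sensor. The sketch below is an illustration under stated assumptions, not Observant’s code: the emotion labels and the 0.5 thresholds are invented for demonstration, and a real classifier would be a trained model rather than two hand-picked coefficients.

import ARKit

// Illustrative sketch only: not Observant's SDK. ARKit's face tracking
// exposes per-frame "blend shape" coefficients (0.0 to 1.0) for individual
// facial movements, derived from the TrueDepth infrared depth map.
final class ExpressionReader: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called on every frame: this corresponds to the "streaming" mode.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            let frown = face.blendShapes[.browDownLeft]?.floatValue ?? 0
            if smile > 0.5 {
                print("likely positive reaction")   // invented threshold
            } else if frown > 0.5 {
                print("likely negative reaction")   // invented threshold
            }
        }
    }
}

A snapshot mode, as described above, would simply sample the most recent coefficients when a specific in-app event fires, rather than reporting on every frame.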
The idea of tracking expressions can strike some users as an invasion of privacy. However, Observant emphasises that it is working to ensure that all users stay informed about how their data is used. In addition, no facial footage or biometric data is uploaded; all analysis happens on the user’s own device.
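In practice, that privacy claim amounts to an architectural boundary: raw frames and depth maps stay inside the on-device analyser, and only derived, non-biometric summaries are ever exposed to code that could reach the network. A minimal sketch of such a boundary follows, with hypothetical type names (EmotionLabel and ReactionEvent are not Observant’s API).

import Foundation

// Hypothetical sketch: only a coarse label and a timestamp are ever
// serialised; pixel buffers and depth data never reach this layer.
enum EmotionLabel: String {
    case positive, negative, neutral
}

struct ReactionEvent: Codable {
    let label: String   // derived result only, no biometric data
    let timestamp: Date
}

func record(_ label: EmotionLabel) {
    let event = ReactionEvent(label: label.rawValue, timestamp: Date())
    if let json = try? JSONEncoder().encode(event) {
        // Whether this summary leaves the device is the host app's choice.
        print(String(data: json, encoding: .utf8) ?? "")
    }
}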
22nd November 2018
Email: team@observantai.com
Website: www.observantai.com
Contact: www.observantai.com/contact