Phaceology software uses AI-based emotion analysis to identify 68 specific landmark points on the face from standard video output. By analyzing these points algorithmically, the system infers attributes such as emotion, gender, and age.
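As a rough illustration of landmark-based analysis, the sketch below derives a simple "smile" feature from mouth geometry in the commonly used 68-point scheme. This is a hypothetical example, not Phaceology's actual algorithm; the point indices and threshold are illustrative assumptions.

```python
# Hypothetical sketch: inferring a "smile" feature from mouth landmarks.
# In the widely used 68-point scheme, indices 48-67 describe the mouth;
# the specific indices and threshold here are illustrative assumptions,
# not Phaceology's actual algorithm.

def mouth_aspect_ratio(landmarks):
    """Ratio of mouth width to mouth opening height."""
    left, right = landmarks[48], landmarks[54]   # mouth corners
    top, bottom = landmarks[51], landmarks[57]   # upper/lower lip centers
    width = abs(right[0] - left[0])
    height = abs(bottom[1] - top[1]) or 1        # avoid division by zero
    return width / height

def looks_like_smile(landmarks, threshold=3.0):
    # A wide, relatively closed mouth tends to have a high aspect ratio.
    return mouth_aspect_ratio(landmarks) > threshold

# Synthetic 68-point set: only the mouth points matter for this demo.
points = {i: (0.0, 0.0) for i in range(68)}
points.update({48: (10, 50), 54: (50, 50), 51: (30, 48), 57: (30, 56)})
print(looks_like_smile(points))  # width 40 / height 8 = 5.0 -> True
```

In practice the 68 points would come from a face-landmark detector running on each video frame; combining many such geometric features is what lets a classifier label emotions.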
Phaceology also analyzes non-verbal factors such as body positioning, gestures, posture, general mobility, and overall response time to certain stimuli. In the future, the technology will also analyze verbal factors, such as speech patterns, and non-linguistic vocal responses such as a sigh or a yawn.
You can use the technology as an embedded system in your devices, or use our API service to send video and receive data in real time over the internet.
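To illustrate the API workflow, the sketch below packages one video frame as a JSON payload ready to send. The endpoint URL, field names, and payload shape are hypothetical assumptions for illustration; the actual API contract is not documented here.

```python
import base64
import json

# Hypothetical sketch of the real-time API workflow: encode a video
# frame and wrap it in a JSON payload. The endpoint URL and field
# names are illustrative assumptions, not the documented API.
API_ENDPOINT = "https://api.example.com/v1/analyze"  # placeholder URL

def build_frame_payload(frame_bytes, camera_id, timestamp):
    """Package one raw video frame for submission to the analysis API."""
    return {
        "camera_id": camera_id,
        "timestamp": timestamp,
        "frame": base64.b64encode(frame_bytes).decode("ascii"),
    }

payload = build_frame_payload(b"\x00\x01\x02", camera_id="lobby-1",
                              timestamp="2024-01-01T09:00:00Z")
body = json.dumps(payload)
# An HTTP client would POST `body` to API_ENDPOINT and read back a
# JSON object of per-frame emotion scores in real time.
print(payload["camera_id"])  # lobby-1
```

Base64 encoding keeps the binary frame safe inside JSON; a streaming deployment would send frames continuously and consume scores as they arrive.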
Because Phaceology scores people's emotions in real time, it can record those scores and provide valuable data for improving customer relations. Each day, employees and locations receive average scores for the types of emotions elicited.
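The daily roll-up described above amounts to a group-by average over the recorded scores. The sketch below shows the idea; the record fields and the 0-1 score scale are assumptions for illustration.

```python
from collections import defaultdict

# Hypothetical sketch of the daily roll-up: average each emotion's
# real-time scores per employee. Field names and the 0-1 score scale
# are illustrative assumptions.
def daily_averages(records):
    """records: iterable of (employee, emotion, score) tuples."""
    sums = defaultdict(lambda: [0.0, 0])
    for employee, emotion, score in records:
        entry = sums[(employee, emotion)]
        entry[0] += score
        entry[1] += 1
    return {key: total / count for key, (total, count) in sums.items()}

day = [
    ("alice", "joy", 0.8), ("alice", "joy", 0.6),
    ("alice", "anger", 0.1), ("bob", "joy", 0.4),
]
print(daily_averages(day))
# e.g. ('alice', 'joy') averages to 0.7
```

The same grouping key could be a location instead of an employee, giving the per-location daily averages mentioned above.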