23 Sep '21
24 Sep '21
“A ‘map’ is built on the face from points that are significant for determining the emotion being experienced. These points are combined into geometric shapes and separate groups of markers whose movement is tracked by algorithms. By analyzing the distance from the center of the face to a marker, the distances between markers, and how these change, the algorithm predicts what emotion a person is experiencing at the moment. Simply put, these markers often correspond to signs of emotion that are visible to the human eye, for example, folds on the forehead as an expression of anger or surprise,” says Laurent Hakobyan, General Director of iPavlov and Director of Applied Software Development at the NTI Competence Center in Artificial Intelligence.

In addition, the AI can analyze the voice by its intensity, volume, pause length, and other parameters. For example, happiness tends to be expressed in shorter periods of constant pitch, while long, intense speech is indicative of negative emotions. The analysis of an employee's condition takes at most 5 seconds. The system was trained on data from both open sources and closed labeled datasets.
“There is no exact information in open sources on the current volume of the emotion detection and recognition market, but in 2017 it was just over 280 million rubles and had about a dozen participants. Our company wants to occupy about 30% of this segment in Russia,” the team representatives say.