The way you walk says a lot about how you feel in the moment. An excited, confident walker typically has open shoulders and a brisker, more rhythmic step, while a sad walker tends toward slumped shoulders. Leveraging this body language, researchers at the University of North Carolina at Chapel Hill and the University of Maryland recently developed a machine learning tool that can identify a person's perceived emotion, whether positive or negative, calm or energetic, from their walk. The approach predicts emotion from walking posture alone and achieved an accuracy of 80.07 percent in preliminary experiments.
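As an illustration of how such posture cues can be quantified, the sketch below computes a simple shoulder-openness angle from 3D joint positions. The joint layout and the formula are illustrative assumptions for this article, not taken from the paper.

```python
# A minimal sketch of the kind of posture cue such a system might compute.
# Joint names, coordinates, and the openness formula are illustrative.
import numpy as np

def shoulder_openness(left_shoulder, right_shoulder, neck):
    """Angle (degrees) subtended at the neck by the two shoulders.

    A wider angle loosely corresponds to an 'open', confident posture;
    a narrower one to slumped, inward-turned shoulders.
    """
    a = left_shoulder - neck
    b = right_shoulder - neck
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Toy 3D joint positions (meters) standing in for one pose-estimate frame.
neck = np.array([0.0, 1.50, 0.0])
left = np.array([-0.25, 1.45, 0.05])
right = np.array([0.25, 1.45, 0.05])
print(f"shoulder openness: {shoulder_openness(left, right, neck):.1f} degrees")
```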
The authors of the research paper note that emotion plays a major role in our lives, defining our experiences, shaping how we view the world, and governing how we interact with other humans. Automatic recognition of perceived emotion is therefore becoming important: it can lead to a better understanding of a person's state, with applications in medicine, law enforcement, shopping, and human-robot interaction.
The researchers considered four basic emotions, happy, sad, angry, and neutral, chosen for their tendency to last an extended period and their abundance in walking activity. They extracted gaits from multiple walking-video corpora using a 3D pose estimation technique, then identified affective features from the resulting poses. The AI system was trained on a novel data set called Emotion Walk, or EWalk, containing 1,384 gaits extracted from videos of 24 subjects walking around a university campus, both indoors and outdoors. Tests showed that their emotion detection approach offers a 13.85 percent improvement over state-of-the-art algorithms and a 24.60 percent improvement over vanilla LSTMs that do not consider affective features. The goal is to provide a real-time pipeline for emotion identification from walking videos by leveraging state-of-the-art 3D pose estimation.
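To make the pipeline concrete, here is a minimal sketch of an LSTM-based gait classifier that fuses a learned temporal encoding with hand-crafted affective cues, in the spirit of the approach described above. The joint count, layer sizes, affective-feature count, and feature names are assumptions made for this sketch, not the paper's actual architecture.

```python
# Sketch only: layer sizes, joint count, and the affective-feature set
# are illustrative assumptions, not the paper's exact design.
import torch
import torch.nn as nn

NUM_JOINTS = 16        # assumed skeleton size from a 3D pose estimator
COORDS = 3             # x, y, z per joint
NUM_AFFECTIVE = 29     # assumed count of hand-crafted posture/movement cues
EMOTIONS = ["happy", "sad", "angry", "neutral"]

class GaitEmotionNet(nn.Module):
    """LSTM over per-frame 3D poses, fused with affective features."""

    def __init__(self, hidden=128):
        super().__init__()
        self.lstm = nn.LSTM(NUM_JOINTS * COORDS, hidden, batch_first=True)
        # Deep gait encoding plus hand-crafted cues feed one linear head.
        self.head = nn.Linear(hidden + NUM_AFFECTIVE, len(EMOTIONS))

    def forward(self, poses, affective):
        # poses: (batch, frames, NUM_JOINTS * COORDS), flattened joints
        # affective: (batch, NUM_AFFECTIVE), e.g., shoulder openness,
        # stride length, walking speed, computed before training
        _, (h, _) = self.lstm(poses)
        fused = torch.cat([h[-1], affective], dim=1)
        return self.head(fused)

# Toy forward pass on random tensors standing in for one extracted gait.
model = GaitEmotionNet()
poses = torch.randn(1, 75, NUM_JOINTS * COORDS)  # ~75 frames of walking
affective = torch.randn(1, NUM_AFFECTIVE)
probs = model(poses, affective).softmax(dim=1)
print({e: round(p.item(), 3) for e, p in zip(EMOTIONS, probs[0])})
```

Fusing the learned sequence encoding with explicit affective cues is the design choice the reported numbers highlight: a vanilla LSTM that sees only the raw pose sequence, without those cues, trailed the full approach by 24.60 percent.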