Gesture detection 2017
Sergiy Turchyn's 2017 Master's thesis project at CWRU, supervised by Professor Soumya Ray, aims to detect the presence or absence of gestures in videos using motion detection (see Tagging for Likelihood of Gesture Data). It also aims to recognize and label timeline gestures, based on 140 manually annotated instances contributed by Red Hen researchers Javier Valenzuela and Cristóbal Pagán Cánovas. Dr. Peter Uhrig ran the code on Red Hen's English corpus at FAU Erlangen, and the results are searchable in the 2018 version of CQPweb.
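The page does not spell out which motion-detection method the project uses, but the core idea of scoring frames for motion can be sketched with simple frame differencing. The function names, frame format, and threshold below are illustrative assumptions, not the thesis's actual implementation:

```python
# Hypothetical sketch: per-frame motion scoring by frame differencing.
# Frames are equal-sized lists of rows of 0-255 grayscale intensities;
# the threshold value is an arbitrary illustration, not from the thesis.

def motion_score(prev_frame, frame):
    """Mean absolute pixel difference between two grayscale frames."""
    total = 0
    count = 0
    for prev_row, row in zip(prev_frame, frame):
        for p, q in zip(prev_row, row):
            total += abs(p - q)
            count += 1
    return total / count

def detect_motion(frames, threshold=10.0):
    """Flag each frame transition whose mean difference exceeds threshold."""
    return [motion_score(a, b) > threshold
            for a, b in zip(frames, frames[1:])]

# Tiny synthetic example: a static frame, then a bright region appears.
still = [[0] * 8 for _ in range(8)]
moved = [[0] * 8 for _ in range(4)] + [[200] * 8 for _ in range(4)]
print(detect_motion([still, still, moved]))  # → [False, True]
```

A real pipeline would decode video frames (e.g. with OpenCV) and smooth the per-frame scores before declaring a gesture likely, but the thresholding step is the same shape.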
- Tagging for Likelihood of Gesture Data
- Red Hen Rapid Annotator
- Machine Learning
- Video processing pipelines
Work by Sergiy Turchyn
Gestures are an integral part of human communication. This thesis presents a visual search engine framework that helps users efficiently locate gesture fragments of interest in long videos of television programs. In order to build such a system, we integrate various modules that detect when people or speakers are on screen along with body part motions including head, hand and shoulder motion. We also provide a detector for a specific class of gestures known as timeline gestures. The system automatically annotates videos with the results of these detectors. An existing gesture annotation tool, ELAN, can be used with these annotations to quickly locate gestures of interest. Finally, we provide an update mechanism for the detectors based on human feedback. We empirically evaluate the detectors to demonstrate their accuracy as well as present data from pilot human studies to show the effectiveness of the overall system.
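Since the system hands its detector output to ELAN, the annotations must be written in ELAN's EAF format, an XML file with a time order and time-aligned tiers. A minimal sketch of emitting detector intervals as one EAF tier follows; the element and attribute names are from the EAF schema, while the tier name and intervals are hypothetical and the thesis's actual export code may differ:

```python
# Hypothetical sketch: serialize detector intervals as an ELAN .eaf tier.
# EAF element/attribute names follow the EAF schema; the tier ID and
# interval labels are made-up examples.
import xml.etree.ElementTree as ET

def to_eaf(intervals, tier_id="MotionDetector"):
    """intervals: list of (start_ms, end_ms, label) tuples."""
    doc = ET.Element("ANNOTATION_DOCUMENT", AUTHOR="", DATE="",
                     FORMAT="3.0", VERSION="3.0")
    ET.SubElement(doc, "HEADER", TIME_UNITS="milliseconds")
    time_order = ET.SubElement(doc, "TIME_ORDER")
    tier = ET.SubElement(doc, "TIER", TIER_ID=tier_id,
                         LINGUISTIC_TYPE_REF="default-lt")
    for i, (start, end, label) in enumerate(intervals):
        # Each annotation gets two time slots (start and end) in TIME_ORDER.
        ts1, ts2 = f"ts{2 * i + 1}", f"ts{2 * i + 2}"
        ET.SubElement(time_order, "TIME_SLOT",
                      TIME_SLOT_ID=ts1, TIME_VALUE=str(start))
        ET.SubElement(time_order, "TIME_SLOT",
                      TIME_SLOT_ID=ts2, TIME_VALUE=str(end))
        ann = ET.SubElement(tier, "ANNOTATION")
        align = ET.SubElement(ann, "ALIGNABLE_ANNOTATION",
                              ANNOTATION_ID=f"a{i + 1}",
                              TIME_SLOT_REF1=ts1, TIME_SLOT_REF2=ts2)
        ET.SubElement(align, "ANNOTATION_VALUE").text = label
    ET.SubElement(doc, "LINGUISTIC_TYPE",
                  LINGUISTIC_TYPE_ID="default-lt", TIME_ALIGNABLE="true")
    return ET.tostring(doc, encoding="unicode")

eaf = to_eaf([(0, 1200, "head motion"), (3400, 5100, "hand motion")])
```

Opening such a file in ELAN alongside the video gives the annotator a clickable tier of candidate gesture intervals to confirm or reject, which is the interaction the update mechanism described above relies on.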
Slideshow (with videos):