Multimodal annotation is time-consuming, and Red Hen is involved in a series of projects that recruit the help of computers to simplify the task. Computer scientists are actively developing tools that train new classifiers to encode the regularities and patterns in a particular set of manual annotations. Such classifiers can in turn be used to propagate the manual annotations automatically to a larger dataset. Automated gesture recognition of this kind can then generate new metadata that makes new forms of communication research possible.
The methods are imperfect, and the types of manual annotations rich and varied, so high-quality classifiers typically need feedback from the user in a recursive learning process. One of Red Hen's goals is to integrate ELAN into such semi-supervised machine learning systems.
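The recursive loop described above — train on manual annotations, propagate labels where the classifier is confident, leave the rest for human review, and retrain — can be illustrated with a toy self-training sketch. This is not Red Hen's actual pipeline; the tiny one-dimensional nearest-centroid classifier below merely stands in for a real gesture classifier, and all names and thresholds are illustrative assumptions.

```python
def centroids(labeled):
    """Mean feature value per label, from a list of (feature, label) pairs."""
    sums, counts = {}, {}
    for x, y in labeled:
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def classify(x, cents):
    """Label of the nearest centroid, plus distance (smaller = more confident)."""
    label = min(cents, key=lambda y: abs(x - cents[y]))
    return label, abs(x - cents[label])

def self_train(labeled, unlabeled, max_dist=1.0, rounds=5):
    """Toy semi-supervised loop: adopt only confident machine labels,
    retrain on the enlarged set, and repeat until no progress is made.
    Points that never clear the confidence threshold are returned for
    human annotation — the 'feedback from the user' step."""
    labeled = list(labeled)
    unlabeled = list(unlabeled)
    for _ in range(rounds):
        cents = centroids(labeled)
        remaining = []
        for x in unlabeled:
            y, dist = classify(x, cents)
            if dist <= max_dist:
                labeled.append((x, y))   # confident: propagate the label
            else:
                remaining.append(x)      # uncertain: defer to the annotator
        if len(remaining) == len(unlabeled):
            break                        # no new labels adopted; stop
        unlabeled = remaining
    return labeled, unlabeled

# Two manually annotated seed points, three unlabeled ones:
lab, unl = self_train([(0.0, "timeline"), (10.0, "other")], [0.5, 9.5, 5.0])
```

Here the points near the seeds are labeled automatically, while the ambiguous midpoint is handed back for manual annotation, mirroring how a semi-supervised system would route uncertain video segments to an ELAN user.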
Professor Ray's project aims to develop the ability to recognize timeline gestures in videos, based on manual annotations of 140 instances contributed by Red Hen researchers Javier Valenzuela and Cristóbal Pagán Cánovas.
Red Hen is proposing several projects for Google Summer of Code 2016 focused on machine learning; see the Ideas page.