Manual tagging
Red Hen provides several tools for manual tagging, suited to different circumstances and purposes:
- For frame-accurate manual annotations, the preferred tool is ELAN (for a basic introduction, see How to annotate with ELAN)
- To draw rectangles on images to indicate the location of a particular feature, we use a script from the iMotion team (see How to set up the iMotion annotator) or vatic
- In the future, we may develop FrameTrail, an HTML5-based online video annotator
- For online annotations, we use NewsScape's integrated online annotation tool (see How to use the online tagging interface)
- For presentations and talks, we use UCLA's Video Annotation Tool (see How to use the Video Annotation Tool)
Red Hen aims to incorporate annotations made in all of these ways into its metadata repository, so that the results are searchable and, when desired, can be used for machine learning. So far, only NewsScape's online tagging interface is fully integrated. We invite contributions to create the import and export scripts for the other tools.
Would you like to accomplish all or part of this task? If so, write to us and we will try to connect you with a mentor.
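As a starting point for such a contribution, here is a minimal sketch of the export side: it reads an ELAN .eaf file with the third-party pympi library and prints each time-aligned annotation as a pipe-delimited line. The output format, the function name, and the file name are illustrative assumptions only; the actual metadata format expected by the Red Hen repository would need to be confirmed with the Red Hen team.

```python
# Sketch of the ELAN-side export, assuming the third-party pympi library
# (installable as pympi-ling). The pipe-delimited output is illustrative only
# and is NOT the confirmed Red Hen metadata format.
from pympi.Elan import Eaf

def export_eaf(path):
    eaf = Eaf(path)                                   # parse the .eaf annotation document
    for tier in eaf.get_tier_names():
        for annotation in eaf.get_annotation_data_for_tier(tier):
            # Aligned annotations carry (begin_ms, end_ms, value, ...).
            begin_ms, end_ms, value = annotation[0], annotation[1], annotation[2]
            print(f"{begin_ms}|{end_ms}|{tier}|{value}")

export_eaf("example_recording.eaf")                   # hypothetical file name
```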
Related pages
- Integrating ELAN (desktop tagging with export to Red Hen)
- How to annotate with ELAN (basic introduction)
- How to set up the iMotion annotator (draws rectangles on images to indicate event location)
- How to use the online tagging interface (integrated into Red Hen, but not frame-accurate)
- How to use the Video Annotation Tool (online multi-dimensional video annotation interface for talks and demos)
Tagging schemes
- Categories. What are the major categories of tags that should be created? E.g. Segment, Gesture, Named Entity Recognition, etc. See How to Use the Online Tagging Interface for the current options.
- Annotation Timeline. Can we design snapshot visualizations of the complex structure of tags (manual or automatic) attached to a recording or an interval of a recording?
- Gestures. What structure is needed for a gesture tag? See How to Use the Online Tagging Interface for the current options.
- Irene Mittelberg and her group have compiled a list, with photographs, of examples of gestures we might tag for. Most of these gestures enact Jana Bressem's form categories.
- Some more examples of gestures we might tag for.
- A beginner's tutorial for tagging in ELAN is available at http://gesturegroup.wikispaces.com/file/view/ELAN%20tutorial.pdf/31716527/ELAN%20tutorial.pdf. Full manuals are at https://tla.mpi.nl/tools/tla-tools/elan/.
- Multiple file N-gram analysis in ELAN. This is an extension of ELAN functionality contributed by Larwan Berke and Rosalee Wolfe. It produces N-gram statistics for a selection of tiers; the size of the N-gram can be set, and the results can be exported to tab-delimited text. The raw data contain the results of many more algorithms than are shown in the statistics overview, and all of these data can likewise be exported as tab-delimited text. This PDF explains the rationale behind the development of this corpus analysis tool and describes the algorithms applied in more detail. Author: Larwan Berke (Gallaudet University). See https://tla.mpi.nl/tools/tla-tools/elan/thirdparty/
- Export options for ELAN: http://www.mpi.nl/corpus/html/elan/ch04s03s02.html. See Section 4.3.2.5, "Tab-delimited text file": all documents can be exported into a tabular format for further analysis and/or printing, including documents created by ELAN itself (see Sections 4.2.1 and 4.2.4) as well as documents imported into ELAN from Shoebox (see Section 4.3.1.8). A sketch of reading such an export appears just below.
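For orientation, here is a minimal sketch that reads such a tab-delimited export and counts annotation bigrams per tier, in the spirit of the N-gram extension described above. It assumes the export was made with the columns Tier, Begin Time, End Time, Duration, and Annotation, in that order; the real column layout depends on the options chosen in ELAN's export dialog, and the file name is hypothetical.

```python
# Minimal sketch: read an ELAN tab-delimited export and count annotation
# bigrams per tier. Assumes columns Tier, Begin Time, End Time, Duration,
# Annotation; the actual layout depends on the export settings chosen in ELAN.
import csv
from collections import Counter, defaultdict

def bigram_counts(path):
    sequences = defaultdict(list)              # tier name -> annotation values in order
    with open(path, encoding="utf-8") as fh:
        for row in csv.reader(fh, delimiter="\t"):
            if len(row) < 5:
                continue                       # skip blank or malformed lines
            tier, annotation = row[0], row[4]
            sequences[tier].append(annotation)
    counts = {}
    for tier, values in sequences.items():
        counts[tier] = Counter(zip(values, values[1:]))   # adjacent pairs = bigrams
    return counts

for tier, counter in bigram_counts("export.txt").items():  # hypothetical file name
    print(tier, counter.most_common(5))
```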
- Jana Bressem's paper: http://www.janabressem.de/wp-content/uploads/2016/10/Bressem_notational-system-overview_final.pdf
- What does a tagging scheme look like? To propose a new tagging scheme, use the following format (a sketch for parsing lines in this format follows the example and link below):
- <tag name> <field name> <select-multi or text> <value>
- Here is an example of a gesture tagging scheme:
- GES Type select-multi left hand
- GES Type select-multi right hand
- GES Type select-multi both hands
- GES Type select-multi point
- GES Type select-multi trajectory
- GES Type select-multi telic
- GES Type select-multi atelic
- GES Type select-multi viewpoint
- GES Type select-multi iconic
- GES Type select-multi metaphoric
- GES Orientation select-multi sagittal
- GES Orientation select-multi lateral
- GES Topic select-multi time
- GES Topic select-multi emotion
- GES Topic select-multi argument
- GES Comment text
- https://sites.google.com/site/distributedlittleredhen/home/profiles-of-red-hen-participants/what-kind-of-red-hen-are-you#tagscheme
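For anyone drafting a scheme in this format, the following sketch parses lines of the form above into a nested structure (tag → field → kind and allowed values). The regular expression and the resulting dictionary layout are one possible reading of the format, not an established Red Hen parser.

```python
# Sketch of a parser for tagging-scheme lines of the form
#   <tag name> <field name> <select-multi or text> <value>
# The layout of the resulting dictionary is an assumption, not an
# established Red Hen format.
import re

LINE = re.compile(r"^(\S+)\s+(.+?)\s+(select-multi|text)(?:\s+(.+))?$")

def parse_scheme(lines):
    scheme = {}
    for line in lines:
        match = LINE.match(line.strip())
        if not match:
            continue                                   # skip headings and blank lines
        tag, field, kind, value = match.groups()
        entry = scheme.setdefault(tag, {}).setdefault(field, {"kind": kind, "values": []})
        if value:
            entry["values"].append(value)              # collect allowed values for select-multi
    return scheme

example = [
    "GES Type select-multi left hand",
    "GES Type select-multi right hand",
    "GES Comment text",
]
print(parse_scheme(example))
# {'GES': {'Type': {'kind': 'select-multi', 'values': ['left hand', 'right hand']},
#          'Comment': {'kind': 'text', 'values': []}}}
```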
- Databricks.
- New form for manual gesture tagging:
- Categories included (a small validation sketch follows the full form below):
- type of gesture (McNeill, 1992);
- body part(s) involved;
- special categories for hands (configuration and orientation);
- movement (for any kind of body part);
- synchronization of gesture with speech and/or image(s);
- gesture phases (when only one of them is described);
- miscellany defined according to specific research needs that may be useful for the whole group: e.g. topic, image-schemas, free-text fields, etc.;
- tagging status (draft vs. final).
Basic types according to broad form-function relation (based on McNeill, 1992)
GES Type select-multi deictic
GES Type select-multi metaphoric
GES Type select-multi beat
GES Type select-multi iconic
GES Type select-multi emblem
Body part(s) involved
GES Body part select-multi left hand
GES Body part select-multi right hand
GES Body part select-multi both hands
GES Body part select-multi head
GES Body part select-multi gaze
GES Body part select-multi eyebrows
GES Body part select-multi other face part
GES Body part select-multi trunk
GES Body part select-multi shoulders
GES Body part select-multi other (see field Comments)
Hand configuration
GES Hand–form clusters select-multi fist
GES Hand–form clusters select-multi flat hand
GES Hand–form clusters select-multi single finger
GES Hand–form clusters select-multi combination of fingers
GES Hand–numbering of fingers select-multi 1=thumb
GES Hand–numbering of fingers select-multi 2=index
GES Hand–numbering of fingers select-multi 3=middle finger
GES Hand–numbering of fingers select-multi 4=ring finger
GES Hand–numbering of fingers select-multi 5=little finger
GES Hand–shape of digits select-multi stretched
GES Hand–shape of digits select-multi bent
GES Hand–shape of digits select-multi crooked
GES Hand–shape of digits select-multi flapped down
GES Hand–shape of digits select-multi connected
GES Hand–shape of digits select-multi touching
Hand orientation
GES Hand orientation select-multi palm-up
GES Hand orientation select-multi palm-down
GES Hand orientation select-multi palm-lateral
GES Hand orientation select-multi palm-vertical
GES Hand orientation select-multi palm-diagonal
Movement
GES Movement–axis select-multi vertical
GES Movement–axis select-multi horizontal/lateral
GES Movement–axis select-multi sagittal
GES Movement–axis select-multi diagonal
GES Movement–direction select-multi upward
GES Movement–direction select-multi downward
GES Movement–direction select-multi leftward
GES Movement–direction select-multi rightward
GES Movement–direction select-multi towards body
GES Movement–direction select-multi away from body
GES Movement–direction select-multi diagonal: right up
GES Movement–direction select-multi diagonal: left up
GES Movement–direction select-multi diagonal: right down
GES Movement–direction select-multi diagonal: left down
GES Movement–direction select-multi circular: clockwise
GES Movement–direction select-multi circular: counterclockwise
GES Movement–direction select-multi towards an object
GES Movement–shape select-multi straight
GES Movement–shape select-multi arced
GES Movement–shape select-multi circular
GES Movement–shape select-multi zigzag
GES Movement–shape select-multi s-line
GES Movement–shape select-multi spiral
GES Movement–size select-multi reduced
GES Movement–size select-multi enlarged
GES Movement–speed select-multi accelerated
GES Movement–speed select-multi decelerated
GES Movement–flow select-multi accentuated
GES Movement–flow select-multi repetitive
Synchronization
GES Synchronization select-multi synchronized with speech
GES Synchronization select-multi before speech
GES Synchronization select-multi after speech
GES Synchronization select-multi synchronized with image
GES Synchronization select-multi before image
GES Synchronization select-multi after image
Phases
GES Phases select-multi preparation
GES Phases select-multi stroke
GES Phases select-multi retraction
Other fields (miscellany)
GES Topic select-multi time
GES Topic select-multi emotion
GES Topic select-multi argument
GES Object designated select-multi physical (on TV set)
GES Object designated select-multi image
GES Object designated select-multi announced/expected image
GES Object designated select-multi expected/announced person
GES Function select-multi depictive
GES Function select-multi interactive
GES Link to grammatical category text (e.g. subject, object, verb, adverb, etc.)
GES Image-schema(s) text (e.g. container, source-path-goal, balance, etc.)
GES Comments text
GES Transcript text
GES Query text
GES Co-speakers’ behavior text
Tagging status
GES Status select-multi draft
GES Status select-multi final
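To illustrate how the form could be used once it is machine-readable, here is a small sketch that checks a proposed field and its values against an excerpt of the scheme above. The dictionary encoding and the function are illustrative assumptions, not part of the Red Hen toolchain.

```python
# Sketch: validate a proposed gesture tag against an excerpt of the scheme.
# The dictionary encoding is an illustrative assumption, not a Red Hen format.
GESTURE_SCHEME = {
    "Type": {"kind": "select-multi",
             "values": ["deictic", "metaphoric", "beat", "iconic", "emblem"]},
    "Hand orientation": {"kind": "select-multi",
                         "values": ["palm-up", "palm-down", "palm-lateral",
                                    "palm-vertical", "palm-diagonal"]},
    "Status": {"kind": "select-multi", "values": ["draft", "final"]},
    "Comments": {"kind": "text"},
}

def validate(field, values):
    """Return a list of problems with a proposed (field, values) pair."""
    spec = GESTURE_SCHEME.get(field)
    if spec is None:
        return [f"unknown field: {field}"]
    problems = []
    if spec["kind"] == "select-multi":
        allowed = set(spec["values"])
        problems += [f"value not in scheme: {v}" for v in values if v not in allowed]
    return problems

print(validate("Hand orientation", ["palm-up", "sideways"]))
# ['value not in scheme: sideways']
print(validate("Comments", ["right hand partly off screen"]))
# []
```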
General overview of new gesture tagging form
- type of gesture: deictic, metaphoric, beat, iconic, emblem (McNeill, 1992);
- body part(s) involved;
- special categories for hand gestures (configuration and orientation);
- movement features;
- synchronization of gesture with speech and/or image(s);
- gesture phases;
- research-oriented miscellany: e.g. topic, image-schemas, free-text fields, etc.;
- tagging status (draft vs. final).