Model and tool for recognition and classification of body gestures
Date: 2017-08-18
Author: Brasil, Gustavo Jordan Castro
Abstract
Multimodal interfaces are becoming more popular and increasingly rely on natural interaction as a resource to enrich the user experience. Computational systems that support multimodality provide a more natural and flexible way to perform tasks on computers, since they allow users with different levels of skill and knowledge to choose the mode of interaction best suited to their needs. Gestures are among these natural forms of interaction, and gesture-based interaction is becoming more popular both as an alternative to the conventional keyboard-and-mouse style of interaction and because of the growth and availability of motion capture devices built on low-cost visual depth sensors. In this context, this dissertation presents a study of all the steps necessary to build a model and tool for the recognition of static and dynamic gestures: segmentation, modeling, description, and classification. Proposed solutions and results are presented for each of these steps and, finally, a tool that implements the model is evaluated on the recognition of a finite set of gestures. All the solutions presented in this dissertation were encapsulated in the GGGesture tool, which aims to simplify research in the area of gesture recognition by allowing communication with multimodal interface systems and natural interfaces.
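The abstract names a four-stage pipeline (segmentation, modeling, description, classification) without detailing the GGGesture API. The sketch below is only a minimal illustration of how such a pipeline could be wired together; the Frame type, the fixed 30-frame windowing, the mean-position descriptor, and the nearest-prototype classifier are assumptions made for the example, not the dissertation's actual method.

```python
from dataclasses import dataclass
from typing import Dict, List, Sequence, Tuple


@dataclass
class Frame:
    # Hypothetical depth-sensor frame reduced to tracked skeleton joints.
    joints: List[Tuple[float, float, float]]  # (x, y, z) per joint


def segment(frames: Sequence[Frame], window: int = 30) -> List[List[Frame]]:
    """Segmentation: split the raw frame stream into candidate gesture windows
    (naively, every `window` consecutive frames form one candidate)."""
    return [list(frames[i:i + window]) for i in range(0, len(frames), window)]


def model(window: List[Frame]) -> List[List[Tuple[float, float, float]]]:
    """Modeling: represent a window as the per-frame joint configurations."""
    return [frame.joints for frame in window]


def describe(trajectory: List[List[Tuple[float, float, float]]]) -> List[float]:
    """Description: turn a trajectory into a fixed-length feature vector,
    here simply the mean position of every joint over the window."""
    n = len(trajectory)
    features: List[float] = []
    for joint_idx in range(len(trajectory[0])):
        for axis in range(3):
            features.append(sum(f[joint_idx][axis] for f in trajectory) / n)
    return features


def classify(features: List[float], prototypes: Dict[str, List[float]]) -> str:
    """Classification: nearest-prototype label by Euclidean distance."""
    def dist(a: List[float], b: List[float]) -> float:
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(prototypes, key=lambda label: dist(features, prototypes[label]))
```

With hypothetical prototype vectors for each known gesture, a stream of frames would flow through the stages as `classify(describe(model(w)), prototypes)` for each window `w` returned by `segment(frames)`; the dissertation's own solutions for each stage are, of course, more elaborate than this toy example.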