
Orientation sensing for gesture-based interaction with smart artifacts

Bibliographic Details
Published in: Computer Communications, 2005-08, Vol. 28 (13), pp. 1552-1563
Main Authors: Ferscha, Alois, Resmerita, Stefan, Holzmann, Clemens, Reichör, Martin
Format: Article
Language: English
Summary: Orientation sensing is considered an important means to implement artifacts enhanced with embedded technology (often referred to as ‘smart artifacts’) that exhibit embodied means of interaction based on their position, orientation, and the respective dynamics. Considering artifacts subject to manual (or ‘by-hand’) manipulation by the user, we identify hand-worn, hand-carried, and (hand-)graspable real-world objects as exhibiting different artifact orientation dynamics, justifying an analysis along these three categories. We refer to orientation dynamics as ‘gestures’ in an abstract sense, and present a general framework for orientation-sensor-based gesture recognition. The framework specification is independent of sensor technology and classification methods, and elaborates an application-independent set of gestures. It enables multi-sensor interoperability and accommodates a variable number of sensors. A core component of the framework is a gesture library that contains gestures from three categories: hand gestures, gestures of artifacts held permanently in the hand, and gestures of artifacts that are detached from the hand and manipulated only occasionally. A gesture detection and recognition system based on inertial orientation sensing is developed and composed into a gesture-based interaction development framework. The use of this framework is demonstrated with the development of tangible remote controls for a media player, both in hardware and in software.
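
For readers who want a concrete picture of the architecture described in the abstract, the following is a minimal Python sketch of how such a sensor-technology- and classifier-independent gesture framework might be organized. All names (GestureLibrary, GestureRecognizer, OrientationSample, etc.) are illustrative assumptions, not identifiers from the paper, and the matching logic is only a placeholder for whatever classification method is plugged in.

from dataclasses import dataclass
from enum import Enum
from typing import Callable, Dict, List, Optional, Sequence


class GestureCategory(Enum):
    # The three gesture categories named in the abstract.
    HAND = "hand gestures"
    HELD_PERMANENTLY = "artifact held permanently in the hand"
    MANIPULATED_OCCASIONALLY = "artifact detached from the hand, manipulated occasionally"


@dataclass
class OrientationSample:
    # One orientation reading (Euler angles, degrees) with a timestamp;
    # the concrete sensor technology behind it is deliberately left open.
    timestamp: float
    yaw: float
    pitch: float
    roll: float


@dataclass
class Gesture:
    # A gesture library entry: a named gesture with a reference trajectory.
    name: str
    category: GestureCategory
    template: List[OrientationSample]


# The classifier is a plug-in: any function scoring a recorded trajectory
# against a gesture template (0..1) keeps the framework independent of the
# classification method.
Classifier = Callable[[Sequence[OrientationSample], Gesture], float]


class GestureLibrary:
    # Application-independent set of gestures, grouped by category.
    def __init__(self) -> None:
        self._by_category: Dict[GestureCategory, List[Gesture]] = {
            c: [] for c in GestureCategory
        }

    def add(self, gesture: Gesture) -> None:
        self._by_category[gesture.category].append(gesture)

    def candidates(self, category: GestureCategory) -> Sequence[Gesture]:
        return self._by_category[category]


class GestureRecognizer:
    # Matches an incoming orientation stream (from any number of sensors,
    # merged upstream) against the library and returns the best match.
    def __init__(self, library: GestureLibrary, classifier: Classifier) -> None:
        self.library = library
        self.classifier = classifier

    def recognize(
        self,
        samples: Sequence[OrientationSample],
        category: GestureCategory,
        threshold: float = 0.8,
    ) -> Optional[Gesture]:
        best, best_score = None, threshold
        for gesture in self.library.candidates(category):
            score = self.classifier(samples, gesture)
            if score > best_score:
                best, best_score = gesture, score
        return best

In a setup like the tangible media-player remote control mentioned in the abstract, a gesture such as ‘tilt forward’ could be registered in the library and a recognized match mapped to a player command; this is a usage assumption for illustration, not a description of the authors' implementation.
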
ISSN: 0140-3664, 1873-703X
DOI: 10.1016/j.comcom.2004.12.046