Mobile Interaction with the Real World


Andreas Zimmermann, Niels Henze, Xavier Righetti and Enrico Rukzio (Eds.)

Mobile Interaction with the Real World
Workshop in conjunction with MobileHCI 2009

BIS-Verlag der Carl von Ossietzky Universität Oldenburg

Oldenburg, 2009

Publishing / printing / distribution:
BIS-Verlag der Carl von Ossietzky Universität Oldenburg
Postfach 2541, 26015 Oldenburg
E-Mail: bisverlag@uni-oldenburg.de
Internet: www.bis-verlag.de

ISBN 978-3-8142-2177-X

Gestural Control of Pervasive Systems using a Wireless Sensor Body Area Network

Oleksii Mandrychenko, Peter Barrie, and Andreas Komninos
Glasgow Caledonian University, 70 Cowcaddens Road, Glasgow G4 0BA, UK

Abstract This paper describes the prototype implementation of a pervasive, wearable gestural input and control system based on a full body-motion-capture system using low-power wireless sensors. Body motion is used to implement a whole-body gesture-driven interface that affords control over ambient computing devices.

1 W-BAN BODY GESTURE CAPTURE

Our system comprises sensor nodes that can be attached to key locations on a user's body, monitoring the movement of major body parts; the hardware is detailed technically in [2]. An internal processing system provides us with a continuously updated skeleton model of the user, a method also used by other researchers, e.g. [3]. The posture of the skeleton is calculated in real time through forward kinematics, which simplifies computation by decomposing the geometric calculations into rotation and translation transforms. Orientation is obtained by combining (fusing) these information sources into a rotation matrix, an algebraic format that can be applied directly to find the posture of the user. The result is a simple skeletal model that serves as a coarse representation of the user.

In general terms, gesture recognition consists of several stages, such as pre-processing, feature extraction, analysis and decision-making. Our experimental method uses the linear angles between any two links in the skeletal model as the dataset that is fed into the gesture recognition algorithms described below.
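Before turning to recognition, the following minimal sketch illustrates how such a per-frame feature vector of link angles might be computed from the fused rotation matrices. This is our illustration under assumed conventions, not the authors' implementation; every function and variable name in it is hypothetical.

```python
# Sketch: per-frame linear-angle features from fused per-link rotation matrices.
# Hypothetical code, not the authors' implementation.
import numpy as np

def angle_between(u: np.ndarray, v: np.ndarray) -> float:
    """Linear angle (radians) between two link direction vectors."""
    c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.arccos(np.clip(c, -1.0, 1.0)))

def posture_features(rotations, rest_pose, link_pairs):
    """One feature vector per frame: the angle between each chosen pair of links.
    rotations / rest_pose map link names to 3x3 rotation matrices / 3-vectors."""
    directions = {name: rotations[name] @ rest_pose[name] for name in rest_pose}
    return np.array([angle_between(directions[a], directions[b])
                     for a, b in link_pairs])

# Example: elbow angle when the forearm is rotated 90 degrees about the x axis.
rest = {"upper_arm": np.array([0.0, -1.0, 0.0]),
        "forearm":   np.array([0.0, -1.0, 0.0])}
rots = {"upper_arm": np.eye(3),
        "forearm":   np.array([[1., 0., 0.], [0., 0., -1.], [0., 1., 0.]])}
print(posture_features(rots, rest, [("upper_arm", "forearm")]))  # ~[1.5708]
```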

Analyzing the sequences of linear angles and performing the gesture recognition itself was implemented with the help of the AMELIA general pattern recognition library [5], which we used as a basis to implement our own customized Hidden Markov Model. Our system allows users to record their own gestures for predefined actions that control the behaviour of ambient computing devices. As such, different actors may use diverse gestures, which can combine multiple body parts moving in different ways, for the same action. Typically, to record one gesture an actor repeats it 3-4 times, as in [1] [5]. Once a few recordings of a gesture have been made, the system is trained on the captured motion data so that it can recognize the gestures. After training, the user can perform gestures in different sequences, as well as perform actions that are not gestures. Our system recognizes gestures with 80-90% accuracy (determined experimentally); a sketch of this train-and-score flow appears at the end of this section. Examples of our gesture recognition system are available to view online in video form (http://www.mucom.mobi/projects/bodyarea).

At this point in time, our system has two limitations. Firstly, saving the recorded gesture training data is not yet implemented (due to development-time constraints), but we consider this straightforward to add. Secondly, our current recognition model does not allow a gesture to end in the actor's relaxed position: if a user stands still and records a gesture that finishes in the relaxed posture, the recognition system cannot determine when the gesture ends. This limitation will be removed in the near future.
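To make the training and recognition flow above concrete, here is a minimal sketch of a comparable HMM-based recognizer. It uses the generic open-source hmmlearn library in place of AMELIA and the authors' customized HMM, takes the per-frame link-angle vectors from Section 1 as observations, and all names and threshold values are assumptions of ours.

```python
# Sketch of a train-and-score gesture recognizer; hmmlearn stands in for the
# AMELIA-based customized HMM described in the paper.
import numpy as np
from hmmlearn import hmm

def train_gesture_model(recordings, n_states=5):
    """Fit one HMM per gesture from the 3-4 recorded repetitions.
    Each recording is a (frames x link-angle-features) array."""
    X = np.vstack(recordings)
    lengths = [len(r) for r in recordings]
    model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
    model.fit(X, lengths)
    return model

def recognize(models, sequence, log_lik_threshold=-200.0):
    """Score a candidate sequence against every gesture model and reject
    low-likelihood sequences, so that non-gesture motion is not classified."""
    scores = {name: m.score(sequence) for name, m in models.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > log_lik_threshold else None
```

The log-likelihood rejection threshold is the simplest way to let "actions that are not gestures" pass unclassified; segmenting a gesture that ends in the relaxed posture, the limitation noted above, would require an explicit end-detection step on top of this.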

2 CONCLUSIONS & FURTHER WORK

Our system is comparable to existing commercial offerings (e.g. XSens, EoBodyHF). Those systems use sets of wired sensor packs connected to a wireless hub, which transmits the aggregated data wirelessly using Bluetooth or 802.15.4 respectively. Our system's advantage is that all sensors are connected wirelessly to a coordinator/transmitter node, which allows for improved wearability and flexibility in configuring the system for full or partial body motion capture. We are particularly interested in its potential in mixed reality situations for gaming. We also wish to investigate issues in human-human interaction through embodied agents controlled through the motion capture system. We are looking into the control of VR agents, as well as robotic agents, for which the metaphor of transferring one's soul will be used to investigate response and interaction with other humans. Finally, we are interested in pursuing applications in tangible interfaces and semi-virtual artifacts, as well as gesture-based whole-body interaction with large situated displays. We hope to create new types of human-computer interfaces for manipulating program windows and arranging or opening files using ad-hoc large projected or semi-transparent situated displays.

3 REFERENCES

[1] Smit, P., Barrie, P., Komninos, A. Mirrored Motion: Pervasive Body Motion Capture using Wireless Sensors. Whole Body Interaction Workshop, ACM CHI 2009, Boston, MA.

[2] Crossan, A., Williamson, J., Brewster, S., Murray-Smith, R., 2008. Wrist rotation for interaction in mobile contexts. In Proceedings of the 10th International Conference on Human-Computer Interaction with Mobile Devices and Services, Amsterdam, The Netherlands: ACM, pp. 435-438.

[3] Rajko, S., Qian, G., Ingalls, T., James, J. Real-time Gesture Recognition with Minimal Training Requirements and Online Learning. IEEE Conference on Computer Vision and Pattern Recognition, 2007.

[4] Bodenheimer, B., Rose, C., Pella, J., Rosenthal, S., 1997. The process of motion capture: Dealing with the data. Computer Animation and Simulation, pp. 3-18.

[5] AMELIA: A generic library for pattern recognition and generation. http://ame4.hc.asu.edu/amelia/ (link valid 5/09)