The use of gestures in computer aided design


Loughborough University Institutional Repository

This item was submitted to Loughborough University's Institutional Repository by the author.

Citation: CASE, K. and DOYLE, R., 1994. The use of gestures in computer aided design. IN: Case, K. and Newman, S.T. (eds.) Proceedings of the 10th National Conference on Manufacturing Research - Advances in Manufacturing Technology VIII, Loughborough, England, 5-7 September 1994. London, UK: Taylor and Francis, pp. 506-510.

Additional Information: This is a conference paper.

Metadata Record: https://dspace.lboro.ac.uk/2134/13515

Version: Accepted for publication

Publisher: © Taylor & Francis

Please cite the published version.

This item was submitted to Loughborough's Institutional Repository (https://dspace.lboro.ac.uk/) by the author and is made available under the following Creative Commons Licence conditions. For the full text of this licence, please go to: http://creativecommons.org/licenses/by-nc-nd/2.5/

THE USE OF GESTURES IN COMPUTER AIDED DESIGN

Dr Keith Case and Robert Doyle
Department of Manufacturing Engineering, Loughborough University of Technology, Loughborough, Leicestershire, LE11 3TU

Computer aided design systems are particularly useful in detailing, analysis and documentation, but are not well-suited to the very early, conceptual aspects of design. This paper describes investigations of novel methods of interfacing between the designer and his computer system, using stereotyped gestures to modify dimensional, positional and orientational parameters for simple three-dimensional geometric models. A prototype implementation using a virtual reality visualisation system, enhanced by the provision of a six degree of freedom real-time tracking device, is described.

Introduction

Computer aided design (CAD) systems are well-suited to the production of precise and detailed design geometry, can be used for functional analysis, and may lead directly into manufacturing planning activities. However, existing CAD systems are not so well-suited to the conceptual aspects of design, where the principal objective is to establish general dimensional and spatial relationships in a highly interactive fashion, and where rigidity and precision can be considered to constrain the designer. The manipulation of 3D shapes within CAD systems, either to change their shape and dimensions or to re-locate and re-orientate them in three-dimensional space, is computationally simple, but the way in which the designer specifies his intentions is by no means easy and straightforward. Gestural input replaces (in part) the need for rigid command languages, menu systems and the like to define the operation required.

Gestural input is a relatively recent research topic, probably because the necessary supporting hardware has only arrived with virtual reality techniques. Prime (1993) describes 2D work in the context of text editing (e.g. Welbourn and Whitrow, 1988, and Rhyne, 1987) and describes his own work on the use of data gloves for hand tracking in three-dimensional space. Hauptmann and McAvinney (1993) provide a useful review of current work and also describe empirical studies of the effectiveness of gesture for graphic manipulation, which reached the important conclusion that there is commonality between subjects in their use of gestures.

Geometry Manipulation

In the manipulation of 3D shapes, two major functions are performed through the user interface: (1) the identification of the object, and (2) the specification of the manipulation. Various techniques are employed to meet the first of these objectives, including the use of a naming scheme, a mouse-driven cursor, etc. The gestural equivalent can be the established methods of lightpens/touch screens, or the use of a hand tracking device to determine the location of a 'pointing' finger. The manipulation usually has one of two objectives: (1) to make some precise change to the model (e.g. scale by two), or (2) to assist the user in some qualitative way (e.g. to make something a little larger). Most CAD systems are geared to the first situation, while the second requires an imprecise method with rapid graphical feedback. A means of achieving this control is a form of input which is related to the stereotypes of human movement, rather than the existing use of linguistic expression of numerical definitions. These gestures must be stereotypes, as a system which requires the learning of a set of movements is merely a different form of artificial language.

Input Interfaces

Assuming that a useful set of gestural stereotypes can be identified, then some form of transducer is required so that body movements can be translated into commands for the CAD system. The transducer must be capable of sampling at high rates, and needs to be provided with sufficient computational power for real-time processing. Similarly, the CAD modelling system must be sufficiently responsive that the user perceives a real-time response to commands; these are characteristics found in virtual reality (VR) systems.

The origins of the work described here lie in some early work (Case and Keeble, 1986) which had the objective of controlling the position, size and orientation of simple objects within a conventional CAD system using human arm movements. This used the UNIMAN posture measuring device, a 'cat-suit' or 'second-skin' garment with in-built strain gauges at the major points of articulation (Paradise, 1980). This work suffered many difficulties, including calibration and speed of communication with the host CAD system, but it did demonstrate a potentially useful technique that could be successfully implemented with adequate hardware and software support. VR systems provide simple 3D modelling and effective communication with the user through a variety of input devices. The work described here is based on the Superscape desktop VR system and a Polhemus 3Space Fastrak device for six degree of freedom sensory input. A small magnetic field sensor is mounted on the hand and obtains positional and orientational information relative to a fixed source. A control unit feeds this information in real time to the VR system, where it can be processed (e.g. to filter out extraneous motion) and used to control objects in the virtual world.
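As an illustration only, a minimal sketch of such a sampling-and-filtering loop is given below (in Python, which postdates the original system). The read_sample and update_virtual_hand functions are hypothetical stand-ins for the Fastrak control unit output and the Superscape side respectively, and the smoothing weight is an arbitrary choice; none of this reflects the actual interfaces used.

    # Minimal sketch of the sample-filter-update loop described above.
    # read_sample() is a hypothetical stand-in for the Fastrak control
    # unit: it returns a 6-DOF pose (x, y, z, azimuth, elevation, roll)
    # relative to the fixed source.

    ALPHA = 0.3  # smoothing weight; smaller values suppress more jitter

    def smooth(previous, sample, alpha=ALPHA):
        """Exponentially smooth one 6-DOF sample to filter out
        extraneous motion before it reaches the virtual world."""
        return tuple(alpha * s + (1.0 - alpha) * p
                     for p, s in zip(previous, sample))

    def tracking_loop(read_sample, update_virtual_hand):
        """Feed filtered tracker poses to the VR system in real time."""
        state = read_sample()
        while True:
            state = smooth(state, read_sample())
            update_virtual_hand(state)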

Gestures

Gestures are stereotyped and recognisable body postures. For example, the 'thumbs-up' sign has almost universal recognition (but beware of cultural differences!). It may be possible to record such gestures in some encoded form such that they can subsequently be 'replayed', and this has been used, for example, in the Benesh Notation for ballet (Benesh and Benesh, 1969). The desire to understand the meaning of gestures has a long history, and Francois Delsarte (1811-1871) defined laws of expression with defined relationships between gestures and their meanings. His Three Great Orders of Movement associated meaning with general characteristics of dynamic gestures: Oppositions, where any two parts of the body moving in opposite directions simultaneously express force, strength and power; Parallelisms, where any two parts of the body moving in the same direction denote weakness; and Successions, where any movement passing through the whole body in a fluid, wave-like motion is the greatest expression of emotion. Specific interpretations of static gestures by parts of the body can also be identified: thus the head raised and turned away can mean pride or revulsion, and the head lowered towards a person shows respect. The first of these might be an example of an involuntary reaction, whilst the latter has become a formalised and learned part of communication in many cultures. Similarly, figure 1 illustrates hand postures and associated meanings.

Figure 1. Hand Gestures and Associated Meanings

Geometry Manipulation with Gestural Input

Initial Shape Generation

A substantial aspect of design is the generation of initial geometric shapes, involving the specification of large amounts of coordinate data, shape descriptors and connectivity relationships. Gestural input at this stage may be feasible by an analogy with physical model building, where shapes may be generated by hand modelling either in a totally virtual world or through some intermediary 'sensitive' material (where changes to a modelling material's shape can be sensed and transmitted to the CAD system). However, these aspects are outside the scope of this work.

Object Selection

Objects are selected by pointing with a stylised model 'hand'. The location of the real hand is tracked in 3D space, and the view of the 3D virtual world tracks the hand to avoid the need for separate view control (figure 2). Dynamic interference detection capabilities are used for selection by collision of the hand with the object to be selected.

Figure 2. Object Selection
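As a purely illustrative sketch of selection by collision, the following reduces the hand to a fingertip point and each object to an axis-aligned bounding box; the object names and box representation are assumptions for the example, not the Superscape data structures.

    # Illustrative selection-by-collision test (not the actual system's
    # interference detection): each object is an axis-aligned bounding
    # box (min corner, max corner) and the hand is a fingertip point.

    def point_in_box(point, box_min, box_max):
        """True if the point lies inside the axis-aligned box."""
        return all(lo <= p <= hi
                   for p, lo, hi in zip(point, box_min, box_max))

    def select_object(fingertip, objects):
        """Return the first object the fingertip collides with, if any."""
        for name, (box_min, box_max) in objects.items():
            if point_in_box(fingertip, box_min, box_max):
                return name
        return None

    # Example: a 100 x 60 x 40 unit with one corner at the origin.
    objects = {'base_unit': ((0, 0, 0), (100, 60, 40))}
    print(select_object((50, 30, 20), objects))  # -> 'base_unit'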

Shape Modification

Shape modification can relate to topology or geometry. Topological changes present essentially the same challenges as initial shape generation, and in our work shape modification refers simply to dimensional changes. Arm flexion/extension, abduction/adduction and vertical motion are used to modify the dimensions of the object. Filtering the tracking device input limits dimensional change to the predominant direction, so that although dimensions can be changed in all three directions within the one mode of working, the user has adequate control over the activity (see figure 3).

Figure 3. Object Dimensioning

Location

Positional changes to objects are achieved in a similar manner to dimensional changes, in that the major hand movements give the required direction.

Orientation

Orientation is achieved by tracking wrist abduction/adduction, flexion/extension and pronation/supination (see figure 4). This is an example of the need to amplify the motion of the real hand so as to be able to control the orientation of the object throughout its range, as the human wrist does not have full freedom of movement.

Figure 4. Object Orientation
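The predominant-direction filtering described under Shape Modification, and the amplified wrist rotation described under Orientation, might be realised along the following lines; this is a minimal sketch with an arbitrary gain value, not the system's actual mapping.

    # Sketches of two hand-to-model mappings described above; the gain
    # is an illustrative assumption, not a value from the original work.

    ROTATION_GAIN = 2.5  # amplify limited wrist travel to cover the
                         # object's full range of orientation

    def dominant_axis_delta(delta):
        """Keep only the predominant component of a hand movement
        (dx, dy, dz), so a dimension changes in one direction at a time."""
        axis = max(range(3), key=lambda i: abs(delta[i]))
        return tuple(d if i == axis else 0.0 for i, d in enumerate(delta))

    def amplified_orientation(wrist_angles, gain=ROTATION_GAIN):
        """Scale wrist abduction/adduction, flexion/extension and
        pronation/supination angles so the object can be turned
        throughout its range despite the wrist's limited movement."""
        return tuple(gain * a for a in wrist_angles)

    # Example: a predominantly horizontal movement changes width only.
    print(dominant_axis_delta((4.0, 0.5, -0.2)))  # -> (4.0, 0.0, 0.0)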

Free Move

Combining orientation and location gives full control over the object: effectively, the model object will track the real hand position and orientation.

Adjacency

Adjacency refers to the frequently desired ability to position and orientate one object relative to another. A good example would be in kitchen design, where a base unit would be butted up to a corner of the room and the floor, and subsequent units would be positioned adjacently. This is achieved by a free move constrained by interference criteria which ensure that the selected object cannot be moved into an already occupied space.

Mode Control

An important issue in the use of gestural input is the need to avoid changes in the mode of control, and thus view control is implemented as an integral part of the geometry interaction activity; the ability to switch control modes (say, from positional to orientational control) must also be accommodated in a similar fashion. A dynamic menu which is part of the model space, and which tracks the virtual hand position so as to always be available within the 3D space currently in view (figure 5), overcomes this difficulty.

Figure 5. Dynamic Menu

Conclusions

There is a long way to go before gestural input can be considered a viable method for interacting with CAD systems, and it should be realised that there are many situations for which the inherent imprecision would be inappropriate. However, this work has demonstrated the feasibility of the approach, and further research is required to determine its acceptability to users.

Acknowledgements

This research was funded by the ACME Directorate of the Science and Engineering Research Council (Grant Number GR/122399, 'Gestural Input for Conceptual Design').

References

Benesh, R. and Benesh, J., 1969, An Introduction to Benesh Movement Notation: Dance, Dance Horizons.
Case, K. and Keeble, J., 1986, Gestural Input to a CAD System, Loughborough University of Technology.
Case, K., 1993, Gestural Input for Conceptual Design, Proceedings of ACME Research Conference, Sheffield University.
Hauptmann, A.G. and McAvinney, P., 1993, Gestures with speech for graphic manipulation, International Journal of Man-Machine Studies, 38, 231-249.
Paradise, M., 1980, Recording Human Posture, PhD Thesis, Department of Production Engineering and Production Management, University of Nottingham.
Prime, M., 1993, Hand tracking devices in complex multimedia environments, Workshop on 3D Visualisation in Engineering Research, Rutherford Appleton Laboratory.
Rhyne, J., 1987, Dialogue management for gestural interfaces, Computer Graphics, 21.
Shawn, T., 1974, Every Little Movement, Dance Horizons.
Welbourn, K. and Whitrow, R., 1988, A gesture based text editor, People and Computing IV, Cambridge University Press, Cambridge.