Virtual Reality: Concepts and Technologies

Editors
Philippe Fuchs, Ecole des Mines, ParisTech, Paris, France
Guillaume Moreau, Ecole Centrale de Nantes, CERMA, Nantes, France
Pascal Guitton, INRIA, University of Bordeaux I, Bordeaux, France

CRC Press, Taylor & Francis Group
Boca Raton London New York Leiden
CRC Press is an imprint of the Taylor & Francis Group, an informa business
A BALKEMA BOOK

Table of Contents

Preface xv
About the editors xvii
List of authors xix
The French Association for Virtual Reality and Mixed Reality xxi

SECTION I Introduction

1 Introduction to virtual reality 3
1.1 Foundation of virtual reality 3
1.1.1 Introduction 3
1.1.2 Definitions of virtual reality 5
1.1.2.1 Origin and simplistic image of virtual reality 5
1.1.2.2 Purpose of virtual reality 6
1.1.2.3 Functional definition 7
1.1.2.4 Technical definition 7
1.2 Book outline 9
Bibliographic references 10

2 Theoretical and pragmatic approach to virtual reality 11
2.1 Human behaviour in a real environment 11
2.2 Behavioural interfaces 12
2.2.1 Hardware design 12
2.2.2 Transparency of an interface 13
2.2.3 Commercial interfaces and custom interfaces 15
2.3 "Instrumental" approach for immersion and interaction 16
2.3.1 Fundamental concepts of behavioural interfacing 16
2.3.2 Behavioural interfaces, schema and metaphors 19
2.3.2.1 Concept of schema 19
2.3.2.2 Use of schemas, metaphors or sensorimotor substitutions 21
2.3.3 Consistency and discrepancy of virtual environment 22
2.3.4 Interface and multimodality 23
2.4 Method of designing and assessing a virtual reality environment 24
2.4.1 VR reference model 24
2.4.2 Virtual behavioural primitives 26

2.4.3 Behavioural Software Aids 27
2.4.3.1 Sensorimotor Software Aids 27
2.4.3.2 Cognitive Software Aids 28
2.4.4 Design approach 29
2.4.5 Assessment approach 32
2.5 Examples of designing and assessing a virtual reality environment 34
2.5.1 Virtual shop for experimentation 34
2.5.1.1 Introduction 34
2.5.1.2 Analysis of the problem based on our general diagram of VR 34
2.5.1.3 Visual observation of products 35
2.5.1.4 Natural handling of 3D products with 6DOF 37
2.5.1.5 Navigation in the shop 37
2.5.2 Training on railway infrastructure using virtual reality 39
2.5.2.1 Analysis of the problem on the basis of our general VR diagram 39
2.5.2.2 2D movement on railway tracks 40
2.5.2.3 Orientation on tracks 41
2.5.2.4 Visual immersion 41
2.5.2.5 Natural handling of objects in 3D with 3DOF 42
2.6 Discussion on our approach for the subject's immersion and interaction 42
2.7 Perspectives and conclusions 44
Bibliographic references 44

SECTION II The human being in virtual environments

3 Human senses 49
3.1 Introduction 49
3.2 Vision 51
3.2.1 The human visual system 52
3.2.1.1 The entire visual system 52
3.2.1.2 The eye 53
3.2.1.3 Accommodation and convergence 53
3.2.1.4 The retina 54
3.2.1.5 The concept of spatial frequency 56
3.2.2 Visual perception of depth 57
3.2.2.1 Cognitive perception by monocular cues 57
3.2.2.2 Convergence and retinal disparity 60
3.2.2.3 Binocular vision and diplopia 62
3.2.2.4 Neurophysiological mechanisms of the perception of depth 63
3.2.3 Psychophysical characteristics of vision 63
3.2.3.1 Light sensitivity 64
3.2.3.2 Frequency sensitivities 64
3.2.3.3 Visual acuity 65
3.2.3.4 Field of vision 67

3.2.3.5 Maximum temporal frequency in vision 68
3.2.3.6 Psychophysical characteristics of stereoscopic vision 68
3.2.3.7 Colour discrimination 70
3.2.3.8 Field dependence-independence 70
3.3 Cutaneous sensitivity 70
3.3.1 The skin 70
3.3.2 Classification of biological sensors 71
3.3.2.1 Nociceptors 71
3.3.2.2 Thermoreceptors 71
3.3.2.3 Mechanoreceptors 73
3.4 Proprioception 76
3.4.1 Introduction 76
3.4.2 Physics of gravity and accelerations 76
3.4.3 Vestibular apparatus and kinaesthetic canals 76
Bibliographic references 79

4 Interaction between virtual reality and behavioural sciences 81
4.1 Introduction 81
4.2 Contribution of virtual reality to behavioural sciences 82
4.2.1 Basic research 82
4.2.2 Applied research 84
4.2.2.1 Training, learning and simulation 84
4.2.2.2 Therapy and rehabilitation 85
4.2.2.3 Visualization in scientific computing 85
4.3 Contribution of behavioural sciences to virtual reality 86
4.3.1 What are the correct parameters? 86
4.3.2 Realism 87
4.3.3 The concept of "real time" 88
4.4 Conclusion 89
Bibliographic references 90

5 Immersion and presence 93
5.1 Introduction 93
5.2 Immersion 94
5.2.1 Sensory richness 94
5.2.2 Interaction 95
5.2.3 Structural factors of immersion 95
5.2.3.1 Coherence 96
5.2.3.2 Mapping 96
5.3 Presence 97
5.3.1 Questionnaires and subjective measurements 97
5.3.2 Physiological measurements 98
5.3.3 Behavioural measurements 98
5.3.3.1 Performance 98
5.3.3.2 Reflex actions 98
5.3.3.3 Sensorimotor control 99

5.4 Conclusion 99
Bibliographic references 100

SECTION III Behavioural interfaces

6 Location sensors 105
6.1 Introduction 105
6.1.1 Spatial location 105
6.1.2 Location sensor and command interface 106
6.2 Mechanical trackers 107
6.2.1 Mechanical trackers measuring distances 107
6.2.2 Mechanical trackers determining an orientation, speed or acceleration 107
6.2.2.1 Inclinometers 108
6.2.2.2 Gyroscopes and rate gyros 108
6.2.2.3 Accelerometers 109
6.3 Electromagnetic trackers 109
6.3.1 Electromagnetic trackers using alternating magnetic field 109
6.3.2 Electromagnetic trackers using impulsive field 111
6.3.3 Characteristics of electromagnetic trackers 112
6.3.4 Compass 113
6.4 Optical trackers 113
6.4.1 Introduction 113
6.4.2 Principle 114
6.4.3 Classification of trackers 115
6.4.4 Some recently launched systems 116
6.4.5 Conclusion 120
Bibliographic references 120

7 Manual motor interfaces 123
7.1 Introduction 123
7.1.1 Location sensor and dataglove 123
7.1.2 Location sensor and command interface 123
7.2 Data gloves 124
7.2.1 Fibre optic gloves 124
7.2.2 Detection of hand movements by cameras 126
7.2.3 Resistance variation gloves 127
7.2.4 Hall effect gloves 128
7.2.5 Special case: binary command glove 129
7.2.6 Conclusion 129
7.3 Command interfaces 130
7.3.1 3D Mouse 131
7.3.2 3D Mouse with force feedback 132
7.3.3 Six degrees of freedom command interface for a large screen 134
7.3.4 Non-manual command interfaces 135
Bibliographic references 136

8 Hardware devices of force feedback interfaces 137
8.1 Introduction 137
8.2 Problems and classification of force feedback interfaces 137
8.3 Design of the force feedback interfaces 140
8.3.1 Performance criteria and specifications 140
8.3.1.1 Concept of transparency 140
8.3.1.2 Necessity of specifications 141
8.3.1.3 Posture and type of grip 141
8.3.1.4 Work space and position resolution 142
8.3.1.5 Static capacity and force resolution 143
8.3.1.6 Dynamics, stiffness, inertia and bandwidth 145
8.3.1.7 Report 145
8.3.2 Modelling and dimensioning 146
8.3.2.1 Problem 146
8.3.2.2 Methods and tools 146
8.3.2.3 Optimisation 150
8.3.3 Technical constraints 151
8.3.3.1 Mechanical architecture of the force feedback interface 151
8.3.3.2 Motorisation 152
8.3.3.3 Reduction stages 153
8.3.3.4 Transmissions 154
8.3.3.5 Balancing 154
8.4 The different force feedback interfaces 154
8.4.1 External reaction force feedback interfaces 154
8.4.1.1 The fixed interfaces with serial structure 154
8.4.1.2 The parallel structure fixed interfaces 157
8.4.1.3 Fixed interfaces with tight ropes 163
8.4.1.4 Fixed interfaces with magnetic levitation 165
8.4.2 Internal reaction force feedback interfaces 165
8.4.2.1 Generic portable interfaces 166
8.4.2.2 Portable interfaces for hand 167
8.4.2.3 Exoskeletons for the hand 169
8.4.2.4 Exoskeletons for the arm 170
8.5 Report 172
Bibliographic references 173

9 Control of a force feedback interface 179
9.1 Introduction 179
9.2 Intuitive description of the haptic coupling 181
9.3 Modelling of the haptic command by a network formalism 183
9.3.1 Passivity 184
9.3.2 Stability 185
9.3.3 Application to the single degree of freedom problem 186
9.4 Conclusion 188
9.5 Annexe: Elements of network theory 188
Bibliographic references 190

10 Tactile feedback interfaces 191
10.1 Introduction 191
10.2 Advantage of tactile feedback interfaces in virtual reality 192
10.3 Designing basics for a tactile interface 193
10.4 State of the art of the tactile interfaces 194
10.4.1 Tactile stimulation technologies 195
10.4.2 Classification of tactile interfaces according to the domain of application 196
10.4.2.1 Tactile interfaces for teleoperation and telepresence 197
10.4.2.2 Tactile interfaces dedicated to the studies of tactile perception 198
10.4.2.3 Tactile interfaces for sensory substitution 202
10.4.2.4 Tactile interfaces for the generation of a 3D surface 202
10.4.2.5 Braille interfaces for the visually impaired 203
10.5 State-of-the-art summary 204
10.6 Conclusion 205
Bibliographic references 206

11 Visual interfaces 211
11.1 Introduction to visual interfaces 211
11.2 Visual interfaces with fixed support 212
11.2.1 Monoscopic computer screens 212
11.2.2 Display of stereoscopic images on a single plane 213
11.2.2.1 Separation at the screen level 213
11.2.2.2 Separation by eyeglasses 214
11.2.3 Large screen projection systems 217
11.2.3.1 Multiple projector architecture 217
11.2.3.2 Distribution of rendering from multiple PCs 218
11.2.3.3 Calibration 220
11.2.3.4 Stereoscopy 222
11.2.3.5 Multi-user stereoscopy 223
11.2.3.6 Different types of projectors 223
11.2.3.7 Passive screens for video projection 225
11.2.3.8 Stereoscopic flat screens 226
11.2.3.9 Connected hardware motor interfaces 227
11.2.4 Examples of large screen projection systems 227
11.2.4.1 Visiodesks or immersive desks 227
11.2.4.2 Human scale visual interfaces: visioroom (immersive room) and visiocube 229
11.3 Portable visual interfaces 234
11.3.1 Architecture of a head-mounted display 235
11.3.2 Head-mounted displays with cathode tube screens 236
11.3.3 Head-mounted displays with liquid crystal screens 237
11.3.4 Optical model of a head-mounted display and related problems 237

11.3.4.1 Problems in the visual quality of a head-mounted display 237
11.3.5 Video eyeglasses 240
11.3.5.1 Video-eyeglasses with LCD screen 240
11.3.6 Head-mounted display and semi-transparent device 240
11.4 Conclusion 242
11.5 Annexe 242
11.5.1 Restitution by volumetric images 242
Bibliographic references 243

12 Interaction techniques for virtual behavioural primitives 247
12.1 Introduction 247
12.1.1 Reminder of our approach on virtual reality 247
12.1.2 Interaction 248
12.2 Virtual behavioural primitives of observation 249
12.2.1 Classification 249
12.2.2 Visual observation 249
12.2.3 Acoustic observation 252
12.2.4 Tactile observation 253
12.3 Wayfinding 253
12.3.1 Introduction 253
12.3.2 Theoretical foundations 254
12.3.2.1 Cognitive map 254
12.3.2.2 Egocentric and exocentric strategies 255
12.3.2.3 Decision-making 255
12.3.3 Wayfinding in a virtual environment 256
12.3.3.1 Characteristics of the virtual world 256
12.3.3.2 Copying the real world 258
12.3.3.3 Addition of software aids 259
12.3.4 Conclusion 262
12.4 Movement 262
12.4.1 Introduction 262
12.4.2 Continuous control 264
12.4.2.1 Movement of the person in the world 264
12.4.2.2 Movement of the world in relation to the person 268
12.4.2.3 Movement of the viewpoint 269
12.4.3 Discrete control 269
12.4.4 Programmed control 270
12.4.5 Evaluations 270
12.4.6 Conclusion 271
12.5 Selection and manipulation 271
12.5.1 Introduction 271
12.5.2 Interaction techniques 272
12.5.3 Accuracy 276
12.5.3.1 Virtual object positioning 277
12.5.3.2 Rotation of a virtual object 278
12.5.3.3 Conclusion 278

12.6 Application control and text input 278
12.6.1 Application control 278
12.6.2 Text input 285
12.6.2.1 Keyboard 285
12.6.3 Conclusion 287
Bibliographic references 288

13 Stereoscopic restitution of vision 293
13.1 Creation of stereoscopic images 293
13.1.1 Principle 293
13.1.2 Choice of stereoscopic parameters 299
13.1.3 Creation of 3D images for teleoperation 300
13.1.3.1 Stereoscopic visual telepresence 300
13.1.3.2 Study of stereoscopic vision 300
13.1.3.3 Deductions of constraints 302
13.1.3.4 Limitation of stereoscopic vision 302
13.1.4 Limitation of visual strain in stereoscopic vision 303
13.1.4.1 Problem of visual strain 303
13.1.4.2 Frequency filtering method 304
13.1.4.3 Experimental results 305
13.1.4.4 Conclusion 306
13.1.5 Creation of images in orthoscopic vision for a design review 306
13.2 Evaluation of stereoscopic techniques 307
13.2.1 Advantages of stereoscopic vision 307
13.2.2 Choice of parameters of stereoscopic vision 307
13.3 Conclusion 308
13.4 Annexe 309
13.4.1 3D perception on a sheet 309
Bibliographic references 309

SECTION IV Tools and models for virtual environments

14 Geometric models of virtual environments 313
14.1 Introduction 313
14.1.1 Types of objects 314
14.1.2 Properties of models 316
14.2 Solid models 316
14.2.1 Spatial enumeration 317
14.2.2 Constructive solid geometry 319
14.3 Surface models 321
14.3.1 Using plane surfaces 322
14.3.2 Using non-planar surfaces 322
14.3.3 NURBS surfaces 323
14.4 Algorithmic geometry 326
14.4.1 Transformation of a volume into surface 327

14.4.2 Polygonal meshing of a point cloud 328
14.4.2.1 Methods of spatial subdivision 329
14.4.2.2 Distance function methods 330
14.4.2.3 Deformation methods 330
14.4.2.4 Surface expansion methods 331
14.4.3 Decimation of meshes 331
14.4.3.1 Incremental algorithms 332
14.4.3.2 Operators 332
14.4.3.3 Error metrics 334
14.5 Optimisation of models for virtual reality 334
14.5.1 Texturing 334
14.5.1.1 Introduction 334
14.5.1.2 Advantages and disadvantages of textures 335
14.5.2 Levels of detail 336
14.5.2.1 Transition command 336
14.5.2.2 Generating the levels of detail 337
Bibliographic references 338

15 Models for visual rendering 339
15.1 Rendering for virtual reality 339
15.1.1 Introduction 339
15.1.2 Real-time rendering 339
15.1.3 Quality and perception 340
15.2 Lighting and shading models 341
15.2.1 Modelling the appearance 341
15.2.1.1 Bidirectional reflectance distribution function 342
15.2.1.2 Textures and bidirectional texture functions 346
15.2.2 Modelling the lighting 350
15.2.2.1 Global illumination and virtual reality 352
15.2.2.2 Local illumination and virtual reality 354
15.3 Rendering and perception 356
15.3.1 Vision models and rendering calculations 356
15.3.1.1 Vision models 356
15.3.1.2 Algorithms of perceptual rendering 358
15.3.2 Tone mapping 359
15.3.2.1 Introduction 359
Bibliographic references 361

16 Models for haptic rendering 367
16.1 Haptic simulation/device coupling 367
16.2 Calculation of haptic rendering 370
16.2.1 Rendering by impedance patterns: calculation of forces 370
16.2.2 Rendering by admittance patterns: calculation of constraints 371
16.2.3 Models primitive to object models (proxy) 372
16.2.3.1 Principle 372

16.2.3.2 Implementation 373
16.2.3.3 Benefits of virtual proxy 374
16.2.4 Modelling the environment for haptic rendering 375
16.3 Frequency adaptation 376
16.3.1 Intermediate representations 377
16.4 Haptic libraries 380
16.5 Conclusion 380
Bibliographic references 381

17 Collision detection 383
17.1 Detection of collision between primitives 383
17.1.1 Definition of collision 384
17.1.2 Spatial detection between convex polyhedrons 384
17.1.3 Spatial detection between any polyhedrons 387
17.1.4 Temporal approaches 390
17.1.4.1 Discrete temporal methods 390
17.1.4.2 Continuous temporal detection 391
17.1.5 Assessment of detection of collision between objects and open problems 394
17.2 Detection pipeline 394
17.2.1 Problem 394
17.2.2 Proximity search (broad-phase) 395
17.2.2.1 Strategies of detection by division of the space 395
17.2.2.2 Strategies of detection by topology and kinematics 396
17.2.3 Approximate detection (narrow-phase) 398
17.2.3.1 Strategies of detection by bounding volumes 398
17.2.3.2 Strategies using graphic hardware 401
17.2.4 Continuous temporal acceleration 402
17.2.5 Summary of acceleration 403
17.3 Processing the collision 404
17.4 Conclusion 405
Bibliographic references 405