Input devices and interaction. Ruth Aylett

1 Input devices and interaction Ruth Aylett

2 Contents Tracking What is available Devices Gloves, 6 DOF mouse, WiiMote

3 Why is it important? Interaction is basic to VEs We defined them as interactive in real-time No interaction => NOT a VE Ideal interaction: Very low latency - i.e. fast Multi-modal Unencumbered Intuitive Technology falls well short of this, of course

4 Tracking the human body Large displays require the position and orientation of the viewer's body to be tracked; the tracking information is fed to the runtime system as an input signal. Most commonly tracked is the head, but sometimes also hands, arms, legs, eyes etc. Head tracking is used to update virtual viewpoint orientations. Body tracking is needed for lifelike interaction with objects and creatures. Say a user wishes to wave at another person in the VE: their real-world motions can be tracked and replicated in the VE.

5 Interaction types Navigation Staying on the ground? Walking v flying Depends on size of model wrt display system Degree of immersion Interaction with other users Gesture Interaction with objects Depends on the object and interaction Select, lift, rotate, throw, steer, hit

6 Virtual Tennis Movie Virtual Tennis

7 Tracking the human head An essential basic requirement in immersive VR systems. Imagine axes mounted on top of your head: pans, tilts and yaws of the head are measured around those axes. HMDs often have rotation sensors to measure these three angles. The angles are passed to the run-time VR software, which updates the viewing angles.
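
A minimal sketch (Python with numpy, used for all illustrations here) of how the three tracked head angles might be turned into a view direction each frame; the axis conventions and angle order are assumptions, not any specific HMD's convention.

    import numpy as np

    def head_rotation(yaw, pitch, roll):
        # Build a rotation matrix from the three tracked head angles (radians).
        # Assumed convention: yaw about the vertical axis, pitch about the side
        # axis, roll about the view axis.
        cy, sy = np.cos(yaw), np.sin(yaw)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cr, sr = np.cos(roll), np.sin(roll)
        Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])   # yaw
        Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])   # pitch
        Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])   # roll
        return Ry @ Rx @ Rz

    # Each frame: rotate the default forward vector to get the new view direction.
    view_dir = head_rotation(np.radians(30), np.radians(-10), 0.0) @ np.array([0.0, 0.0, -1.0])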

8 Tracking devices Many tracking devices and systems have been developed over the years: some aimed specifically at VR systems, others borrowed from other areas. Some systems are portable and cheap; some require permanent installations in large rooms and are very expensive indeed. Trackers can be magnetic, electro-magnetic, acoustic, inertial, optical, or mechanical. Electro-magnetic trackers: a transmitter generates electromagnetic signals received by a receiver (or sensor). Signal strength is used to determine the absolute position and orientation of the receiver relative to the transmitter.

9 Example: Polhemus FASTRAK FASTRAK electro-magnetic sensor from Polhemus accurately computes the position and orientation of a tiny receiver as it moves through space. Dynamic, real-time six degree-of-freedom measurement of position (X, Y, and Z) and orientation (yaw, pitch, and roll); RS-232 signal updated at 120 records/sec. The transmitter constantly puts out a weak magnetic field; a passive receiver generates an electric signal as it is moved through the field. Polhemus' processing electronics then amplify and analyse this signal to determine the real-world position and orientation of the receiver relative to the transmitter.
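
As an illustration of how such a tracker feeds the run-time system, here is a hypothetical sketch of reading one 6 DOF record over a serial port with pyserial; the port name, baud rate and record layout are assumptions, not the documented FASTRAK protocol.

    import serial  # pyserial

    # Assumed port, baud rate and ASCII record "x,y,z,yaw,pitch,roll" (illustrative only).
    with serial.Serial("/dev/ttyUSB0", 115200, timeout=0.1) as port:
        line = port.readline().decode("ascii", errors="ignore").strip()
        if line:
            x, y, z, yaw, pitch, roll = (float(v) for v in line.split(","))
            # Hand the record to the run-time VR software (e.g. update the viewpoint).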

10 Polhemus FASTRAK system Polhemus trackers have been well proven and widely used since the very early 1990s. The FASTRAK system shown here has one receiver and one transmitter. The system can be expanded by adding up to three more receivers; receivers can be attached to different parts of the body to log data for gait and limb analysis or computer animation.

11 Electromagnetic Tracking Polhemus

12 Electromagnetic Tracking Ascension Ascension market a number of systems based on DC rather than AC fields including Flock of Birds and a full gait analysis system called MotionStar.

13 Electromagnetic Tracking Advantages Small receivers Reasonably cheap Line-of-sight (LOS) not required Disadvantages Accuracy diminishes with distance Not very large working volume High latency due to filtering Transmitter/receiver required
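
The "high latency due to filtering" point can be made concrete with a small sketch (assuming a simple single-pole exponential smoother applied to one coordinate): heavier smoothing suppresses field-distortion jitter but makes the reported position lag behind the real one.

    def smooth(samples, alpha=0.2):
        # Exponential smoothing of successive tracker samples.
        # Smaller alpha means less jitter but more lag (latency).
        filtered, y = [], None
        for x in samples:
            y = x if y is None else alpha * x + (1.0 - alpha) * y
            filtered.append(y)
        return filtered

    print(smooth([0.0, 0.0, 1.0, 1.0, 1.0]))  # output approaches 1.0 only gradually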

14 Electro-magnetic interference A major problem of electro-magnetic trackers: magnetic fields are easily affected by the surrounding environment. Large metal objects produce eddy currents in the presence of the magnetic fields; these can interfere with and distort the original signal, causing inaccurate measurements. The same effect appears near electric currents (such as in cabling), near ferromagnetic materials, and near electromagnetic sources such as computer monitors. Ferromagnetic and/or metal surfaces cause field distortion.

15 Ultrasonic trackers Two main components: a transmitter generating an ultrasound signal and a receiver detecting the signal. Distance is calculated by measuring the time-of-flight of the ultrasonic pulse. Three transmitters and three receivers are needed to calculate full 3D position and orientation. Ultrasonic tracking is used by the Logitech Head Tracker (shown) and 3D mouse.
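
A minimal sketch of the time-of-flight calculation, assuming known transmitter positions and using scipy's least-squares solver for the position fix; the speed-of-sound formula is the standard approximation 331.3 + 0.606*T m/s, which is also why acoustic accuracy varies with air temperature.

    import numpy as np
    from scipy.optimize import least_squares

    def speed_of_sound(temp_c):
        return 331.3 + 0.606 * temp_c   # m/s, approximate; varies with air temperature

    def position_from_tof(tx_positions, flight_times, temp_c=20.0, guess=(0.0, 0.0, 1.0)):
        # Receiver position from ultrasonic time-of-flight to three (or more)
        # transmitters at known positions. With exactly three transmitters there
        # is a mirror-image ambiguity; the initial guess selects the correct side.
        c = speed_of_sound(temp_c)
        distances = c * np.asarray(flight_times)
        tx = np.asarray(tx_positions, dtype=float)
        residual = lambda p: np.linalg.norm(tx - p, axis=1) - distances
        return least_squares(residual, np.asarray(guess, dtype=float)).x

    # Hypothetical transmitters at three corners of a 1 m frame
    print(position_from_tof([(0, 0, 0), (1, 0, 0), (0, 1, 0)], [0.0040, 0.0035, 0.0035]))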

16 Ultrasonic trackers The Power Glove, made by toy company Mattel (who make Barbie), was introduced in 1989 for use with the Nintendo Entertainment System (NES). An ultrasonic device used in place of standard Nintendo controllers: detected finger motion, plus a full set of buttons on the wrist. In fact not much use to Nintendo gamers, but an amazingly advanced piece of VR kit for its time.

17 Acoustic Tracking Advantages Well-known transducers (mics), lightweight Low-cost device Disadvantages Line-of-sight (LOS) required Echoes Low accuracy (speed of sound in air varies) Transmitter/receiver required

18 Inertial tracking systems Very popular (because cheap); based on inertial gyro technology. Detects acceleration and thus can calculate velocity (since mass is known), giving 3 DoF. A newish example is the Intersense IS-300. Can be coupled with an add-on ultrasonic system to give 6 DoF sensing: an example of a hybrid technology tracker. The IS-300 can operate in metallic environments; the 6 DoF tracker operates only in LoS of the transmitter. Other examples: Intersense InterTrax2 and the Ascension 3D-Bird.

19 Inertial Tracking Advantages Cheap Small size No transmitter/receiver required LOS not required Disadvantages Only 3 DOF on their own Drift Not accurate for slow movements
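
A small sketch of why drift dominates: position comes from integrating acceleration twice, so even a tiny constant accelerometer bias (0.02 m/s^2 is assumed below) grows quadratically with time.

    import numpy as np

    dt = 0.01                           # 100 Hz samples
    t = np.arange(0.0, 10.0, dt)        # 10 seconds of standing perfectly still
    measured = np.zeros_like(t) + 0.02  # true acceleration is zero; sensor bias 0.02 m/s^2

    velocity = np.cumsum(measured) * dt   # first integration
    position = np.cumsum(velocity) * dt   # second integration

    print(position[-1])   # roughly 1 m of apparent movement after only 10 s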

20 Optical tracking methods Many different forms. Often use image processing, pattern recognition and matching. Much work outside of VR: numerous ideas suitable for tracking object position and pose. For example, fiducial mark detection: light sources or reflective colour markers attached to the object at important locations such as joints or extremities. These are easier for an image processing algorithm to track in cluttered conditions.

21 How it is done

22 Optical tracking methods Outside-in tracker: the tracking apparatus is fixed and the object to be tracked (e.g. the user) is viewed from the "outside". Inside-out systems take tracking measurements from the object to be tracked: for instance, a camera can be mounted on the HMD and its images analysed to produce pose and distance estimations based on the position of fixed patterns within the environment. Visible images or infra-red are used. Many optical systems (but not all!) are one-offs, expensive and require careful calibration procedures.
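
A minimal sketch of the inside-out idea using OpenCV's solvePnP: given the known room coordinates of some fixed fiducial marks and where they appear in the head-mounted camera's image, recover the camera (and hence head) pose. The marker coordinates, detected pixel positions and camera intrinsics below are made-up values.

    import numpy as np
    import cv2

    # Known 3D positions of four fiducial marks fixed in the room (metres); assumed values.
    object_points = np.array([[0, 0, 0], [0.5, 0, 0], [0.5, 0.5, 0], [0, 0.5, 0]], np.float32)
    # Pixel positions at which the marks were detected in the camera image; assumed values.
    image_points = np.array([[320, 240], [420, 238], [424, 150], [322, 148]], np.float32)
    # Hypothetical pinhole intrinsics of the head-mounted camera.
    K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], np.float32)

    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
    if ok:
        R, _ = cv2.Rodrigues(rvec)    # room-to-camera rotation
        head_position = -R.T @ tvec   # camera (head) position in room coordinates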

23 Infra-red cameras

24 Optical Tracking Advantages Can work over a large area. Inherently wireless Disadvantages LOS needed Transmitter/receiver required Expensive Requires computer vision technology

25 Eye trackers Eye tracking systems are examples of optical tracking devices. The viewpoint in the virtual world follows the gaze of the user's eye. Originally developed as a mouse replacement: simply look at an object and interact through eye movement (such as a slow blink). Supports physically impaired users. Combined eye and head tracking systems also exist - use in practice is complicated.

26 Mechanical trackers Mechanical linkage system: an arm-like structure of several joints, one end fixed, the other free to move with the user. Measures position and angular orientation of the free end by measuring the angles at each joint and factoring in the length of each segment. Fakespace BOOM (right)
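
A minimal sketch of the measurement principle, assuming a planar arm with two segments for simplicity (a real boom has more joints and works in 3D): chain the joint angles and segment lengths to obtain the free end's position.

    import numpy as np

    def end_position(segment_lengths, joint_angles):
        # Planar forward kinematics: accumulate each joint's rotation and walk
        # along each segment to find where the free end sits.
        x = y = heading = 0.0
        for length, angle in zip(segment_lengths, joint_angles):
            heading += angle
            x += length * np.cos(heading)
            y += length * np.sin(heading)
        return x, y, heading

    # Two 0.5 m segments; base joint at 30 degrees, elbow at -45 degrees.
    print(end_position([0.5, 0.5], np.radians([30.0, -45.0])))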

27 Mechanical Tracking Advantages Simple sensors, no need for transmitter/receiver Low-cost device Very low latency High positional accuracy Disadvantages The user is tethered Lots of inertia Typically small working volume Mechanical parts wear out

28 Unencumbered tracking Depends on identifying the hand(s) in video One approach uses blob detection

29 Cybergloves and similar Inherent in the folklore and hype of VR is the cyberglove - a wearable device that monitors the position and orientation of the hand and fingers. The name CYBERGLOVE is registered by Virtual Technologies Inc (VTi). It uses 18 or 22 patented angular sensors for tracking the position of the fingers and hand.

30 Virtual Technologies CyberGlove - 18-sensor model - 22-sensor model Variants are: - CyberTouch - CyberGrasp Gloves

31 Gloves Fifth Dimension Technologies - Data Glove Data Glove: finger flexure; hand orientation (roll & pitch)

32 Gloves Fakespace - Pinch Glove Pinch Glove: gesture recognition; reliable, low cost; electrical sensors in each fingertip detect contact among any 2 or more digits

33 Mouse as input device in VR A normal 2D mouse can be used (as in Cortona, for example). Needs user-selectable modes to switch between DoFs. More sophisticated mice provide 3 or more DoF: these include the Spaceball (shown here) and Spacemouse. Standard games joysticks or gamepads are also used to give 2 or more DoFs.
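
A small sketch of the "user-selectable modes" idea: the same 2D mouse delta is mapped onto different degrees of freedom depending on which mode the user has selected. Mode names and scale factors are illustrative.

    def apply_mouse_delta(pose, dx, dy, mode):
        # pose: dict holding translation (x, y, z) and rotation (yaw, pitch).
        if mode == "translate_xy":
            pose["x"] += dx
            pose["y"] -= dy
        elif mode == "translate_z":
            pose["z"] += dy                 # push/pull along the view axis
        elif mode == "rotate":
            pose["yaw"] += dx * 0.01
            pose["pitch"] += dy * 0.01
        return pose

    pose = {"x": 0, "y": 0, "z": 0, "yaw": 0, "pitch": 0}
    apply_mouse_delta(pose, 5, -3, "rotate")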

34 6 DOF Mice 3 translation DOF 3 rotation DOF

35 6 DOF Mice Spaceball by Labtec Spacemouse by DLR (Logitech - USA)

36 6 DOF Mice Cyberpuck SpaceOrb

37 The WiiMote 3 accelerometers Enough for 6 DOF But will drift Bluetooth connection to 10m Optical (IR) sensor To 5m from sensor bar Triangulation from ends of bar Allows accurate pointing Speaker
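
A rough sketch of the pointing idea: the WiiMote's IR camera reports the two sensor-bar blobs; their midpoint gives a cursor position and their pixel separation gives an approximate range via similar triangles. The camera field of view and LED spacing below are approximate, assumed values.

    import math

    CAM_W, CAM_H = 1024, 768        # WiiMote IR camera resolution (approximate)
    FOV_X = math.radians(40.0)      # assumed horizontal field of view
    BAR_WIDTH = 0.20                # assumed spacing of the sensor-bar LED clusters (m)

    def point_and_range(blob1, blob2):
        # blob1, blob2: (x, y) pixel positions of the two sensor-bar dots.
        focal_px = (CAM_W / 2) / math.tan(FOV_X / 2)   # pinhole focal length in pixels
        sep_px = math.hypot(blob1[0] - blob2[0], blob1[1] - blob2[1])
        distance = BAR_WIDTH * focal_px / sep_px       # similar triangles
        mid_x = (blob1[0] + blob2[0]) / 2
        mid_y = (blob1[1] + blob2[1]) / 2
        cursor = (1.0 - mid_x / CAM_W, mid_y / CAM_H)  # mirror x: the image moves opposite to the hand
        return cursor, distance

    print(point_and_range((480, 384), (560, 384)))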

38 WiiMote interaction Head-tracking WiiMote stationary, head-mounted IR source Finger-tracking - touch-free interaction IR tape on finger + fixed IR source Gesture recognition Using accelerometers Feature classification Fast movements work better; beware variable arm orientation
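
A minimal sketch of the "feature classification" step: reduce a window of raw accelerometer samples to a short fixed-length feature vector (per-axis mean, standard deviation and energy are assumed here) that any classifier can consume.

    import numpy as np

    def gesture_features(window):
        # window: (N, 3) array of accelerometer samples (x, y, z) for one gesture.
        w = np.asarray(window, dtype=float)
        means = w.mean(axis=0)
        stds = w.std(axis=0)
        energy = (w ** 2).sum(axis=0) / len(w)
        return np.concatenate([means, stds, energy])   # 9-element feature vector

    # Feed gesture_features(window) to any classifier (nearest neighbour, SVM, ...).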

39 Software Free libraries WiiGLE Provides a set of classifiers WiiGee Java-based, one classifier Issues with Bluetooth stacks Flaky implementations, especially on Vista BlueSoleil seems a good driver
