Hand Tracking and Visualization in a Virtual Reality Simulation


FridayPM1SystemsA&D.2

Hand Tracking and Visualization in a Virtual Reality Simulation

Charles R. Cameron, Louis W. DiValentin, Rohini Manaktala, Adam C. McElhaney, Christopher H. Nostrand, Owen J. Quinlan, Lauren N. Sharpe, Adam C. Slagle, Charles D. Wood, Yang Yang Zheng, and Gregory J. Gerling, Member, IEEE

Abstract- Tracking a user's hand for 3D rendering and visualization creates a sense of presence in virtual reality environments. At present, tracking devices built for both research and consumer use are increasingly accessible, with ever-improving spatiotemporal accuracy. This work seeks to contribute a new design which offers the ability to track the fingers and palm. The hand tracking method described herein ties absolute tracking of the user's palm to relative tracking of the individual fingers. A virtually rendered image of the movements of the user's hand is displayed in near real time in a virtual environment developed with the X3D ISO standard for representing 3D graphics. The tracking system was verified using experiments designed to confirm the accuracy and usability of the device. Experiment 1 tested the absolute positioning capability by tracking the subject's palm. Experiment 2 tested the relative positioning capability by tracking the subject's fingers. The results of the experiments indicate that the tracking component of the virtual reality system is able to accurately detect subjects' interaction with objects in the virtual environment.

I. MOTIVATION

Central to the successful impression of user realism in a virtual reality environment is the extent to which a user feels immersed in the experience. Immersion is a central topic of simulation design that has been studied extensively [1-3]. A significant factor in immersion is the accurate and timely tracking and representation of a user's hand in the virtual environment.
If the user can see a virtual rendering of his or her hand and its movements relative to the movements of other objects, there is a much better chance that the user will feel that the virtual hand embodies his or her intentions. Currently, several means of tracking the hand and body are found in the market, such as the CyberGlove II, 5DT Data Glove Ultra, and Microsoft Kinect, though each is limited in scope of usage to one extent or another, several are quite costly, and others are difficult to integrate with virtual environments with graphics and haptics rendering engines. A low-cost, adaptable, and accurate tracking solution will lead to a more intuitive and effective virtual environment.

Manuscript received April 4, 2011. C. R. Cameron, A. C. McElhaney, and A. C. Slagle are fourth year students in Mechanical Engineering at the University of Virginia, Charlottesville (crc9s@virginia.edu, acm9g, acs9p). L. W. DiValentin, O. J. Quinlan, L. N. Sharpe, and C. D. Wood are fourth year students in Systems and Information Engineering at the University of Virginia, Charlottesville (lwd5d@virginia.edu, ojq5a, lns3c, cdw4j). R. Manaktala is a fourth year student in Biomedical Engineering at the University of Virginia, Charlottesville (rm9xe@virginia.edu). C. H. Nostrand and Y. Y. Zheng are fourth year students in Computer Science (CS) at the University of Virginia, Charlottesville (chn8sh@virginia.edu, yyz5w). G. J. Gerling is an assistant professor in SIE at the University of Virginia, Charlottesville, VA (gg7th@virginia.edu).

II. BACKGROUND

The field of motion tracking has recently witnessed increased attention. Several companies have developed products used by CGI animators, video game developers, and sign language specialists to accurately track the position of a user's hand or body.
Devices vary in capability to track increasing degrees of freedom and to track with higher spatial and temporal accuracy, from gloves with one sensor per finger which measure finger flexure [4], to those with 40 sensors covering the entire hand [5]. The AcceleGlove, for instance, is a glove which tracks acceleration of the individual fingers. It uses six 3-axis micro-electro-mechanical systems (MEMS) accelerometers to capture the relative position of the palm and each of the fingertips [6]. These commercial products can provide a great deal of accuracy. The CyberGlove II, for instance, has flexure sensors which are accurate to within 1 degree [7]. While existing systems are adept at tracking the position of the fingers, they fall short in their ability to determine the absolute position of the hand. However, individual components which could be combined to develop a more accurate hand tracking system currently exist. Flock of Birds is an absolute tracking system that uses a magnetic field to track a sensor with six degrees of freedom (X, Y, Z, pitch, roll, yaw) [8]. In the video game industry specifically, motion tracking has been introduced in new peripheral input devices. Each of the major console developers (Microsoft, Nintendo, and Sony) has created a device (Kinect, Wii, and PlayStation Move, respectively) which allows the player to move a controller or their body and have the game respond appropriately. The Wii and PlayStation Move both track only a single point within a small field (3-8 feet) [9], limiting their applicability to more complex tasks. The Kinect, which uses 3D motion imaging [10], has the ability to track multiple degrees of freedom of a user's entire body without an additional controller, which could greatly increase the number and complexity of tasks that can be tracked with similar hardware.
Currently, these devices cannot track the hand and fingers with the spatial accuracy desired at the level of individual fingers when the user is situated close to the display, and others show significant temporal delay.

/11/$ IEEE 127 SERC RT19_CAMERON_UVa_2011SEIDS_CHARLOTTESVILLE,VA_2011APRIL Page 1

Research demonstrates the need for real-time performance coupled with acceptable realism and presence in a simulation [11]. However, an increase in the realism of a simulation often necessitates a decrease in performance time (and vice versa) due to limited computing power. By using less complicated, but still sufficiently accurate, tracking devices and techniques, it should be possible to maintain acceptable levels of presence, realism, and performance time.

III. HIGH LEVEL DESCRIPTION

In order to develop a simulator, two main components had to be created. One is a glove that can both track the position of the user's hand in real time and provide feedback to the user when an event occurs in the simulation. The other is the virtual environment in which the user's hand is displayed and tasks can be assigned and carried out. Equally important to the system is the flow of data between the glove and the environment. Since various components of the system have different capabilities and limitations, a data transfer system is required to integrate the many sources of data and combine them to work cooperatively in the virtual environment. The interaction between the user and the virtual environment also requires accurate collision detection. Rapid data transfer, response time, and refresh rates provide the user with realistic interactions in the virtual environment. These interactions must occur in real time to give the user the sense that the simulation is real; otherwise the user will not be sufficiently immersed in the environment. The virtual environment (VE) was developed using the H3D API, and implements low-level data communication using Python and C#. By detecting when collisions have occurred within the VE, the user is provided with the appropriate force response.

IV. REQUIREMENTS

A number of requirements were developed to ensure a tracking and visualization system that is an effective component of a virtual reality simulation.

1. The device shall have absolute tracking
   a. The device shall have absolute tracking within a 3 meter boundary radius
   b. The device shall be accurate to within 1 cm
   c. The device shall track in three dimensions
2. The device shall have rotational tracking
   a. The device shall show the rotation of the yaw
   b. The device shall show the rotation of the pitch
   c. The device shall show the rotation of the roll
   d. The device shall rotate 360 degrees
3. The device shall have relative tracking of the fingers with 180 degrees of motion

V. METHODS

A. Overview

To track the 3D position of the fingers and palm of the hand, an instrumented glove was developed containing multiple devices. These include the absolute tracking device Flock of Birds and the finger flexure measuring Flex Sensors.

B. Devices

1) Flock of Birds

Flock of Birds (FOB) (Ascension Technology Corporation) is a six-degree-of-freedom motion tracking system that measures the position and orientation of sensors through a transmitter. It can simultaneously track and measure all of its sensors by transmitting a pulsed direct-current magnetic field. When a sensor is within ±1.22m of its transmitter, the sensor can make from 22 to 144 measurements per second of its current position and orientation [12]. It has an angular range of ±180 degrees in azimuth and roll and ±90 degrees in elevation. In addition, it has a static positional accuracy of 2.13cm root mean square (RMS) and a static angular accuracy of 0.5 degree RMS. It has a positional resolution of 9.14cm at 3.66m and an angular resolution of 0.1 degree RMS at 3.66m. Data output from a transmitter may be delayed up to 2 milliseconds. FOB provides the values for orientation and rotation in several different formats, including Euler angles, rotation matrices, and quaternions. It can be configured to use a single transmitter and sensor or a complex combination of transmitters and sensors, making it fit well with different types of applications.

Fig. 1. (a) Flock of Birds. (b) The box on the left of the monitor is the transmitter, and the device placed on top of the palm is the sensor.

We use the FOB with a single transmitter and sensor to constantly track the current position and orientation of the glove so that it can be properly simulated and displayed in the virtual environment (VE). We use a C++ program to extract the values of the sensor's position and orientation from the FOB. Because the coordinate systems of the FOB and our VE differ, we need to convert the position and rotation values from the FOB to values that reflect the VE's coordinate system. For the position values, we need to swap their axes and negate and scale them. For rotation values, we need to obtain the rotation matrix from the FOB and use it to switch axes.

2) Flex Sensors

Flex sensors (Spectra Symbol, Flex Sensor 11.43cm (SEN-08606), Salt Lake City, UT) monitor finger flexion. These sensors are only 0.64cm wide and 11.43cm long, allowing them to lie directly along the finger unobstructed. As the sensors are flexed, the resistance across the sensor increases, which allows the bending angle to be inferred from the increase in resistance. Because they measure only one axis of bending, the sensors are very dependable and are less prone to noise than accelerometers. Inside the flex sensors are carbon resistive elements on a thin flexible substrate. When the substrate is bent, the sensor produces a resistance directly related to the bend radius. In a prototype glove, 5 flex sensors, one for each finger of a hand, are wired as variable analog voltage dividers on an Arduino microcontroller board (ATMega1280). The Arduino then reads a variable voltage from 0-5 Volts based on the bending of the finger. This reading, represented as a single byte, is then scaled to the proximal flexion angles of the human fingers. The Arduino then sends the byte across the serial port, where an X3D program deciphers it and moves the hand to the correct position. The wiring schematic for the flex sensors is shown in Fig. 2.
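The two mappings just described can be sketched in Python. The axis permutation, negation, scale factor, and 90-degree full-scale flexion angle below are illustrative assumptions; the paper does not publish its actual values:

```python
import numpy as np

# Hypothetical change-of-basis: the paper says FOB positions must be
# axis-swapped, negated, and scaled for the VE, but does not give the
# exact permutation, so this P is an assumed example.
P = np.array([[0.0, 1.0,  0.0],   # VE x <-  FOB y
              [0.0, 0.0, -1.0],   # VE y <- -FOB z
              [1.0, 0.0,  0.0]])  # VE z <-  FOB x
SCALE = 0.01                      # e.g., FOB centimeters -> VE meters

def fob_position_to_ve(p_fob):
    """Swap axes, negate, and scale a FOB position vector."""
    return SCALE * (P @ np.asarray(p_fob, dtype=float))

def fob_rotation_to_ve(r_fob):
    """Re-express a FOB rotation matrix in the VE frame by conjugating
    with the change-of-basis matrix P."""
    return P @ np.asarray(r_fob, dtype=float) @ P.T

def flex_byte_to_angle(byte_value, full_scale_deg=90.0):
    """Scale the Arduino's 0-255 flex reading to a proximal flexion
    angle; 90 degrees matches the sensors' typical operating bend."""
    return (byte_value / 255.0) * full_scale_deg
```

Because P is a pure permutation-with-signs matrix, it is orthogonal, so the same P converts both positions and rotation matrices.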
Using Autodesk Inventor, several rings were designed with a slit to hold the flex sensors in place along the hand. A rapid prototyping system (uPrint) was used to produce plastic models to hold 5 different flex sensors along each finger of the hand. Finger flexion and extension refer to bending towards and away from the palm, while finger abduction and adduction refer to the fingers bending towards each other. The disadvantage of the flex sensors is that they are only able to measure flexion and extension angles. Abduction cannot be measured because the sensors can only bend along one axis. The flex sensor has a life cycle of over 1 million cycles, can operate in the temperature range of -35 to 80 degrees C, and exhibits resistances from 60,000 to 110,000 Ohms. The operating range typically occurs within a 90 degree bend. Therefore, the Arduino readings are scaled to fit the optimal bending of the hand. One other disadvantage of flex sensors is that they can only measure a finger's joints bending in unison; they cannot portray the independent bending of a single joint. Regardless, typical grasping movements involve a natural bending arc, and the motion capture capabilities of the flex sensors are sufficient to display this.

Fig. 2. Electrical diagram of Arduino circuit board (left) and 5 flex sensors (right) connected to analog input lines.

C. Virtual Reality Visualization

Haptic software development platforms allow users to create applications that display 3D figures in a virtual environment and interact with users through force feedback. Some of the platforms available in the market are OpenHaptics, GHOST, CHAI 3D, and MHaptic. To develop our simulation, we used H3DAPI, an open source haptics platform. This platform renders 3D figures using OpenGL and X3D, two widely used standards for producing computer graphics.
H3DAPI can support many different haptics devices available in the market, such as the SensAble PHANTOM, Force Dimension Omega, and Novint Falcon, because it is device independent [13]. It also works well with custom-made devices. Its device independence is one of the main reasons we use it instead of other haptics software platforms.

Fig. 3. A model of the hand in the virtual environment.

Using H3DAPI, we created a model of the hand by grouping a box with fourteen cylinders (see Fig. 3). Each individual cylinder represents a segment of a finger, and a group of two to three cylinders represents a single finger. The palm is represented by a box with a height of 6.35cm, width of 6.35cm, and thickness of 1.59cm. All cylinders have a radius of 0.73cm; their heights vary from 2.06cm to 3.18cm. Several factors are necessary to simulate a moving hand. First, the palm and the fingers need to move at exactly the same time. They also should never be disjointed. The bending of each finger is simulated as rotations of each cylinder and group of cylinders, and the point of rotation is the bottom of each cylinder. H3DAPI is written in the C++ programming language. In addition to C++, programmers can also use the X3D modeling language and the Python scripting language to develop new software applications on this platform. We use X3D and Python to program the graphics component. We use X3D to create basic 3D shapes such as the box and the cylinders. H3DAPI represents each of these shapes as a node. Furthermore, we use X3D to define their colors, sizes, positions, and rotations. We use Python to process the various data in the VE. The Python script communicates with the devices through User Datagram Protocol (UDP) sockets. It also performs all of the calculations needed to map the values received from these devices to the values that display 3D figures in the VE. H3DAPI also represents this Python script as a node in the X3D file. This node is linked with the figures in the VE and affects attributes such as position and rotation to animate them.

D. Data Communication

The data transfer system used for tracking hand movements to generate an accurate visualization utilizes a set of two datagram sockets with associated input control programs, as shown in Fig. 4. Sockets present a means of data transfer between multiple languages and systems.
A UDP datagram socket can be used to stream data between programs not written in the same language. This works by designating a port number and passing data to an abstract address in a computer's memory. A listener in another program can execute an action whenever a new value is pushed to that address, effectively allowing programs in different languages to communicate. Tracking data is received from two sources on the glove and turned into movement on a screen. The system must then also recognize when a collision has occurred in the virtual environment and send the appropriate feedback to the glove. For the Flock of Birds, a C++ control program uses custom input methods to read from the serial port and convert the input into six values. The program interprets these values as three Cartesian coordinates and three Euler angles and concatenates them into a single string of values. This string is delivered to a datagram socket every time the values register on the serial port. For the Flex Sensors, an Arduino microcontroller has been programmed to map the changes in resistance over the sensors into a set of integers valued from 0 to 255. The Arduino then concatenates these integers into a string and sends it to the serial port. A C# program delivers the string to a datagram socket every time the Arduino sends a set of values. The simulation then uses the values from both ports to set the position, orientation, and finger flexion of the virtual hand.

Fig. 4. Data transfer system.
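A minimal sketch of one side of this bridge: the sender packs the six tracker values into a comma-separated string and pushes it to a local UDP port, and the listener parses the datagram back into floats. The port number and message format here are our assumptions, not values from the paper:

```python
import socket

TRACKER_PORT = 5005  # assumed port; the paper does not specify one

def pack_sample(values):
    """Concatenate tracker values (x, y, z and three Euler angles)
    into a single comma-separated string, as the control program does."""
    return ",".join(f"{v:.3f}" for v in values).encode()

def unpack_sample(payload):
    """Listener side: parse the datagram back into a list of floats."""
    return [float(v) for v in payload.decode().split(",")]

def send_sample(values, port=TRACKER_PORT):
    """Deliver one packed sample to the local datagram socket."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(pack_sample(values), ("127.0.0.1", port))
```

The VE process would bind a socket to the same port and call recvfrom in a loop, updating the hand model whenever a new datagram arrives.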

E. Collision Detection

Application of feedback at the appropriate time is essential to the user's sense of immersion in the virtual environment. To give this feedback, the system must detect when a user contacts an object in the virtual environment. Important factors for the implementation of collision detection are the spatial accuracy of the virtual objects in the environment, the temporal accuracy of the interaction between objects (the speed at which a collision can be detected), and the speed at which feedback can be delivered to the glove. Two alternative collision detection methods are the creation of a custom algorithm or the importation of an existing library. The trade-off between the two is versatility versus convenience. A custom algorithm allows full access to the implementation and rapid customization to specific tasks, but it adds code complexity and uncertainty unless the algorithm is appropriately validated. Use of an existing library is convenient but restricts collision detection to the functionality of that library. Because of the restrictions of the existing libraries, the tracking system that was developed necessitated the use of a custom algorithm. The algorithm developed for collision detection only has to be functional for simple geometric shapes. It uses the absolute positions of specific points on an object (e.g., the center) and its dimensions to calculate the distance between two objects. In the case of collision detection between two spheres, the condition for collision is

|S1 - S2| < r1 + r2    (1)

where S1 and S2 are the position vectors of each sphere and r1 and r2 are the radii of the spheres. If the distance between the objects is less than the objects' boundaries (e.g., if the distance between two spheres is less than the sum of their radii), then the two objects have collided.

VI. EXPERIMENTS

The degree to which a user feels immersed in a virtual reality simulation depends on the accuracy and timing of tracking, rendering and representation of the user's hand and fingers, and the detection of collisions in the virtual environment. The first experiment tested the FOB's ability to track the absolute position of the subject's palm. The second experiment tested the bend sensors' ability to track the relative position of the subject's fingers. These experiments also tested the subjects' interaction accuracy, ease of manipulation, and agility in the virtual environment. The primary goal of these experiments was to test the tracking, visualization, and collision detection capabilities of the virtual reality system.

A. Setup

1) Experiment #1: Absolute Tracking Test

Fig. 5. Screen shot of experimental setup #1.

As shown in Fig. 5, the setup consists of twelve identically sized spheres (radius = inches) labeled A-L. Each subject was asked to complete a pattern by virtually touching the labeled spheres in a specific order. The order used for the experiment was A, L, K, G, E, F, C, D, G, F, B, K, J, I, C, D, E, H, I, B, K, F, L, K, H, B, A, C.

2) Experiment #2: Relative Tracking Test

Fig. 6. Screen shot of experimental setup #2.

As shown in Fig. 6, the hand positioned in the black background is the virtual representation of the subject's hand. The nine finger positions were randomly selected one at a time, and the subject was required to recreate them.

B. Procedures

For the first experiment, the subjects followed a designated pattern by moving their virtual hand to different lettered spheres until they completed the pattern. The time it took the subjects to complete this task was recorded. For the second experiment, the subjects were told to mimic particular finger positions. The time it took for each finger position to be completed and visualized in the virtual environment was recorded.
C. Participants

Five male subjects attending UVa participated in the experiments. All of them were right-hand dominant. They were also not familiar with the tracking, visualization, and collision detection capabilities used in the experiments.
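Scoring a virtual touch in these experiments reduces to the sphere-sphere test of Eq. (1). A minimal Python sketch (our illustration, not the authors' code):

```python
import math

def spheres_collide(s1, s2, r1, r2):
    """Eq. (1): two spheres have collided when the distance between
    their centers is less than the sum of their radii."""
    return math.dist(s1, s2) < r1 + r2
```

For instance, a fingertip sphere at (0, 0, 0) with radius 0.6 touches a target sphere at (1, 0, 0) with radius 0.6, since the center distance 1.0 is less than 1.2.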

D. Results

For the first experiment, the average task completion time was ± 6.35 seconds. For the second experiment, the average task completion time was ± 7.89 seconds. Based on the experimental results, four conclusions can be made: 1) the visual representation of the subject's hand and fingers in the virtual environment is realistically depicted, 2) the FOB is able to accurately detect the absolute position of the subject's hand, 3) the Flex Sensors are able to precisely track the relative position of the subject's fingers, and 4) collision detection is functional in both experiments, contributing to the subject's perceived sense of reality in the simulations.

VII. DISCUSSION

While the system is currently operational, albeit in a rudimentary form, the task of accurately tracking the hand has been accomplished through the use of various devices. In the future, this tracking and displaying capability will become a key feature in the development of simulations. This technology can be applied in a variety of fields, including medical simulations. In the case of medicine, accurate tracking of each of the fingers and the hand is critical to the user's sense of immersion, and therefore to the success of the simulation as a whole and its use as a learning tool.

REFERENCES

[1] R. Pausch, D. Proffitt, and G. Williams, "Quantifying immersion in virtual reality," Proceedings of the 24th annual conference on Computer graphics and interactive techniques, 1997, pp.
[2] F.P. Brooks, "What's real about virtual reality?," IEEE Computer Graphics and Applications, 1999, pp.
[3] K. Mania and A. Chalmers, "The effects of levels of immersion on memory and presence in virtual environments: A reality centered approach," CyberPsychology & Behavior, vol. 4, 2001, pp.
[4] 5DT Data Glove 5 Ultra, Mar. Available:
[5] ShapeHand, 2009. Available:
[6] AcceleGlove FAQs. Available:
[7] CyberGlove II Specifications. Available: ve-ii/specifications
[8] Flock of Birds. Available:
[9] Nintendo - Wii Sensor Bar Settings. Available: /settingssensorbar.jsp
[10] PrimeSense Supplies 3-D-Sensing Technology to Project Natal for Xbox 360: Groundbreaking optical-sensing and recognition technologies to aid gesture control platform, Mar.
[11] J. Seo and G.J. Kim, "Design for presence: a structured approach to virtual reality system design," Presence: Teleoperators & Virtual Environments, vol. 11, 2002, pp.
[12] Ascension Technology Corporation, Flock of Birds: Installation and Operation Guide.
[13] SenseGraphics, H3DAPI: The Open Source Platform for Multimodal Development.


More information

Extended Kalman Filtering

Extended Kalman Filtering Extended Kalman Filtering Andre Cornman, Darren Mei Stanford EE 267, Virtual Reality, Course Report, Instructors: Gordon Wetzstein and Robert Konrad Abstract When working with virtual reality, one of the

More information

Advancements in Gesture Recognition Technology

Advancements in Gesture Recognition Technology IOSR Journal of VLSI and Signal Processing (IOSR-JVSP) Volume 4, Issue 4, Ver. I (Jul-Aug. 2014), PP 01-07 e-issn: 2319 4200, p-issn No. : 2319 4197 Advancements in Gesture Recognition Technology 1 Poluka

More information

CSCI 599 Physically Based Modeling for Interactive Simulation and Games Topic: Haptics

CSCI 599 Physically Based Modeling for Interactive Simulation and Games Topic: Haptics CSCI 599 Physically Based Modeling for Interactive Simulation and Games Topic: Haptics Mathematica It is a powerful tool. It can be used to check if the code works correct. Simple usage by example: 1.

More information

FLEX SENSOR BASED ROBOTIC ARM CONTROLLER: DEVELOPMENT

FLEX SENSOR BASED ROBOTIC ARM CONTROLLER: DEVELOPMENT FLEX SENSOR BASED ROBOTIC ARM CONTROLLER: DEVELOPMENT Jagtap Gautami 1, Alve Sampada 2, Malhotra Sahil 3, Pankaj Dadhich 4 Electronics and Telecommunication Department, Guru Gobind Singh Polytechnic, Nashik

More information

KINECT CONTROLLED HUMANOID AND HELICOPTER

KINECT CONTROLLED HUMANOID AND HELICOPTER KINECT CONTROLLED HUMANOID AND HELICOPTER Muffakham Jah College of Engineering & Technology Presented by : MOHAMMED KHAJA ILIAS PASHA ZESHAN ABDUL MAJEED AZMI SYED ABRAR MOHAMMED ISHRAQ SARID MOHAMMED

More information

Classifying 3D Input Devices

Classifying 3D Input Devices IMGD 5100: Immersive HCI Classifying 3D Input Devices Robert W. Lindeman Associate Professor Department of Computer Science Worcester Polytechnic Institute gogo@wpi.edu Motivation The mouse and keyboard

More information

BW-IMU200 Serials. Low-cost Inertial Measurement Unit. Technical Manual

BW-IMU200 Serials. Low-cost Inertial Measurement Unit. Technical Manual Serials Low-cost Inertial Measurement Unit Technical Manual Introduction As a low-cost inertial measurement sensor, the BW-IMU200 measures the attitude parameters of the motion carrier (roll angle, pitch

More information

HAPTIC BASED ROBOTIC CONTROL SYSTEM ENHANCED WITH EMBEDDED IMAGE PROCESSING

HAPTIC BASED ROBOTIC CONTROL SYSTEM ENHANCED WITH EMBEDDED IMAGE PROCESSING HAPTIC BASED ROBOTIC CONTROL SYSTEM ENHANCED WITH EMBEDDED IMAGE PROCESSING K.Gopal, Dr.N.Suthanthira Vanitha, M.Jagadeeshraja, and L.Manivannan, Knowledge Institute of Technology Abstract: - The advancement

More information

Toward an Augmented Reality System for Violin Learning Support

Toward an Augmented Reality System for Violin Learning Support Toward an Augmented Reality System for Violin Learning Support Hiroyuki Shiino, François de Sorbier, and Hideo Saito Graduate School of Science and Technology, Keio University, Yokohama, Japan {shiino,fdesorbi,saito}@hvrl.ics.keio.ac.jp

More information

AUTONOMOUS MOTION CONTROLLED HAND-ARM ROBOTIC SYSTEM

AUTONOMOUS MOTION CONTROLLED HAND-ARM ROBOTIC SYSTEM Autonomous Motion Controlled Hand-Arm Robotic System AUTONOMOUS MOTION CONTROLLED HAND-ARM ROBOTIC SYSTEM NIJI JOHNSON AND P.SIVASANKAR RAJAMANI KSR College of Engineering,Thiruchengode-637215 Abstract:

More information

CS 354R: Computer Game Technology

CS 354R: Computer Game Technology CS 354R: Computer Game Technology http://www.cs.utexas.edu/~theshark/courses/cs354r/ Fall 2017 Instructor and TAs Instructor: Sarah Abraham theshark@cs.utexas.edu GDC 5.420 Office Hours: MW4:00-6:00pm

More information

CS277 - Experimental Haptics Lecture 2. Haptic Rendering

CS277 - Experimental Haptics Lecture 2. Haptic Rendering CS277 - Experimental Haptics Lecture 2 Haptic Rendering Outline Announcements Human haptic perception Anatomy of a visual-haptic simulation Virtual wall and potential field rendering A note on timing...

More information

Development of K-Touch TM Haptic API for Various Datasets

Development of K-Touch TM Haptic API for Various Datasets Development of K-Touch TM Haptic API for Various Datasets Beom-Chan Lee 1 Jong-Phil Kim 2 Jongeun Cha 3 Jeha Ryu 4 ABSTRACT This paper presents development of a new haptic API (Application Programming

More information

Initial Project and Group Identification Document September 15, Sense Glove. Now you really do have the power in your hands!

Initial Project and Group Identification Document September 15, Sense Glove. Now you really do have the power in your hands! Initial Project and Group Identification Document September 15, 2015 Sense Glove Now you really do have the power in your hands! Department of Electrical Engineering and Computer Science University of

More information

R (2) Controlling System Application with hands by identifying movements through Camera

R (2) Controlling System Application with hands by identifying movements through Camera R (2) N (5) Oral (3) Total (10) Dated Sign Assignment Group: C Problem Definition: Controlling System Application with hands by identifying movements through Camera Prerequisite: 1. Web Cam Connectivity

More information

PRODUCTS DOSSIER. / DEVELOPMENT KIT - VERSION NOVEMBER Product information PAGE 1

PRODUCTS DOSSIER.  / DEVELOPMENT KIT - VERSION NOVEMBER Product information PAGE 1 PRODUCTS DOSSIER DEVELOPMENT KIT - VERSION 1.1 - NOVEMBER 2017 www.neurodigital.es / hello@neurodigital.es Product information PAGE 1 Minimum System Specs Operating System Windows 8.1 or newer Processor

More information

A Step Forward in Virtual Reality. Department of Electrical and Computer Engineering

A Step Forward in Virtual Reality. Department of Electrical and Computer Engineering A Step Forward in Virtual Reality Team Step Ryan Daly Electrical Engineer Jared Ricci Electrical Engineer Joseph Roberts Electrical Engineer Steven So Electrical Engineer 2 Motivation Current Virtual Reality

More information

A TECHNIQUE TO EVALUATE THE IMPACT OF FLEX CABLE PHASE INSTABILITY ON mm-wave PLANAR NEAR-FIELD MEASUREMENT ACCURACIES

A TECHNIQUE TO EVALUATE THE IMPACT OF FLEX CABLE PHASE INSTABILITY ON mm-wave PLANAR NEAR-FIELD MEASUREMENT ACCURACIES A TECHNIQUE TO EVALUATE THE IMPACT OF FLEX CABLE PHASE INSTABILITY ON mm-wave PLANAR NEAR-FIELD MEASUREMENT ACCURACIES Daniël Janse van Rensburg Nearfield Systems Inc., 133 E, 223rd Street, Bldg. 524,

More information

Gesture Identification Using Sensors Future of Interaction with Smart Phones Mr. Pratik Parmar 1 1 Department of Computer engineering, CTIDS

Gesture Identification Using Sensors Future of Interaction with Smart Phones Mr. Pratik Parmar 1 1 Department of Computer engineering, CTIDS Gesture Identification Using Sensors Future of Interaction with Smart Phones Mr. Pratik Parmar 1 1 Department of Computer engineering, CTIDS Abstract Over the years from entertainment to gaming market,

More information

Classifying 3D Input Devices

Classifying 3D Input Devices IMGD 5100: Immersive HCI Classifying 3D Input Devices Robert W. Lindeman Associate Professor Department of Computer Science Worcester Polytechnic Institute gogo@wpi.edu But First Who are you? Name Interests

More information

Image Manipulation Interface using Depth-based Hand Gesture

Image Manipulation Interface using Depth-based Hand Gesture Image Manipulation Interface using Depth-based Hand Gesture UNSEOK LEE JIRO TANAKA Vision-based tracking is popular way to track hands. However, most vision-based tracking methods can t do a clearly tracking

More information

ModaDJ. Development and evaluation of a multimodal user interface. Institute of Computer Science University of Bern

ModaDJ. Development and evaluation of a multimodal user interface. Institute of Computer Science University of Bern ModaDJ Development and evaluation of a multimodal user interface Course Master of Computer Science Professor: Denis Lalanne Renato Corti1 Alina Petrescu2 1 Institute of Computer Science University of Bern

More information

EKT 314/4 LABORATORIES SHEET

EKT 314/4 LABORATORIES SHEET EKT 314/4 LABORATORIES SHEET WEEK DAY HOUR 4 1 2 PREPARED BY: EN. MUHAMAD ASMI BIN ROMLI EN. MOHD FISOL BIN OSMAN JULY 2009 Creating a Typical Measurement Application 5 This chapter introduces you to common

More information

Air Marshalling with the Kinect

Air Marshalling with the Kinect Air Marshalling with the Kinect Stephen Witherden, Senior Software Developer Beca Applied Technologies stephen.witherden@beca.com Abstract. The Kinect sensor from Microsoft presents a uniquely affordable

More information

Robotic System Simulation and Modeling Stefan Jörg Robotic and Mechatronic Center

Robotic System Simulation and Modeling Stefan Jörg Robotic and Mechatronic Center Robotic System Simulation and ing Stefan Jörg Robotic and Mechatronic Center Outline Introduction The SAFROS Robotic System Simulator Robotic System ing Conclusions Folie 2 DLR s Mirosurge: A versatile

More information

THE IMPORTANCE OF PLANNING AND DRAWING IN DESIGN

THE IMPORTANCE OF PLANNING AND DRAWING IN DESIGN PROGRAM OF STUDY ENGR.ROB Standard 1 Essential UNDERSTAND THE IMPORTANCE OF PLANNING AND DRAWING IN DESIGN The student will understand and implement the use of hand sketches and computer-aided drawing

More information

SELF STABILIZING PLATFORM

SELF STABILIZING PLATFORM SELF STABILIZING PLATFORM Shalaka Turalkar 1, Omkar Padvekar 2, Nikhil Chavan 3, Pritam Sawant 4 and Project Guide: Mr Prathamesh Indulkar 5. 1,2,3,4,5 Department of Electronics and Telecommunication,

More information

Localization (Position Estimation) Problem in WSN

Localization (Position Estimation) Problem in WSN Localization (Position Estimation) Problem in WSN [1] Convex Position Estimation in Wireless Sensor Networks by L. Doherty, K.S.J. Pister, and L.E. Ghaoui [2] Semidefinite Programming for Ad Hoc Wireless

More information

Application of 3D Terrain Representation System for Highway Landscape Design

Application of 3D Terrain Representation System for Highway Landscape Design Application of 3D Terrain Representation System for Highway Landscape Design Koji Makanae Miyagi University, Japan Nashwan Dawood Teesside University, UK Abstract In recent years, mixed or/and augmented

More information

CHAPTER 1. INTRODUCTION 16

CHAPTER 1. INTRODUCTION 16 1 Introduction The author s original intention, a couple of years ago, was to develop a kind of an intuitive, dataglove-based interface for Computer-Aided Design (CAD) applications. The idea was to interact

More information

Interactive Simulation: UCF EIN5255. VR Software. Audio Output. Page 4-1

Interactive Simulation: UCF EIN5255. VR Software. Audio Output. Page 4-1 VR Software Class 4 Dr. Nabil Rami http://www.simulationfirst.com/ein5255/ Audio Output Can be divided into two elements: Audio Generation Audio Presentation Page 4-1 Audio Generation A variety of audio

More information

Haptic Feedback Technology

Haptic Feedback Technology Haptic Feedback Technology ECE480: Design Team 4 Application Note Michael Greene Abstract: With the daily interactions between humans and their surrounding technology growing exponentially, the development

More information

Fabrication of the kinect remote-controlled cars and planning of the motion interaction courses

Fabrication of the kinect remote-controlled cars and planning of the motion interaction courses Available online at www.sciencedirect.com ScienceDirect Procedia - Social and Behavioral Sciences 174 ( 2015 ) 3102 3107 INTE 2014 Fabrication of the kinect remote-controlled cars and planning of the motion

More information

VR System Input & Tracking

VR System Input & Tracking Human-Computer Interface VR System Input & Tracking 071011-1 2017 년가을학기 9/13/2017 박경신 System Software User Interface Software Input Devices Output Devices User Human-Virtual Reality Interface User Monitoring

More information

SMARTPHONE SENSOR BASED GESTURE RECOGNITION LIBRARY

SMARTPHONE SENSOR BASED GESTURE RECOGNITION LIBRARY SMARTPHONE SENSOR BASED GESTURE RECOGNITION LIBRARY Sidhesh Badrinarayan 1, Saurabh Abhale 2 1,2 Department of Information Technology, Pune Institute of Computer Technology, Pune, India ABSTRACT: Gestures

More information

IOSR Journal of Engineering (IOSRJEN) e-issn: , p-issn: , Volume 2, Issue 11 (November 2012), PP 37-43

IOSR Journal of Engineering (IOSRJEN) e-issn: , p-issn: ,  Volume 2, Issue 11 (November 2012), PP 37-43 IOSR Journal of Engineering (IOSRJEN) e-issn: 2250-3021, p-issn: 2278-8719, Volume 2, Issue 11 (November 2012), PP 37-43 Operative Precept of robotic arm expending Haptic Virtual System Arnab Das 1, Swagat

More information

ReVRSR: Remote Virtual Reality for Service Robots

ReVRSR: Remote Virtual Reality for Service Robots ReVRSR: Remote Virtual Reality for Service Robots Amel Hassan, Ahmed Ehab Gado, Faizan Muhammad March 17, 2018 Abstract This project aims to bring a service robot s perspective to a human user. We believe

More information

Building a bimanual gesture based 3D user interface for Blender

Building a bimanual gesture based 3D user interface for Blender Modeling by Hand Building a bimanual gesture based 3D user interface for Blender Tatu Harviainen Helsinki University of Technology Telecommunications Software and Multimedia Laboratory Content 1. Background

More information

VEWL: A Framework for Building a Windowing Interface in a Virtual Environment Daniel Larimer and Doug A. Bowman Dept. of Computer Science, Virginia Tech, 660 McBryde, Blacksburg, VA dlarimer@vt.edu, bowman@vt.edu

More information

A Kickball Game for Ankle Rehabilitation by JAVA, JNI and VRML

A Kickball Game for Ankle Rehabilitation by JAVA, JNI and VRML A Kickball Game for Ankle Rehabilitation by JAVA, JNI and VRML a a b Hyungjeen Choi, Jeha Ryu, and Chansu Lee a Human Machine Computer Interface Lab, Kwangju Institute of Science and Technology, Kwangju,

More information

Haptic control in a virtual environment

Haptic control in a virtual environment Haptic control in a virtual environment Gerard de Ruig (0555781) Lourens Visscher (0554498) Lydia van Well (0566644) September 10, 2010 Introduction With modern technological advancements it is entirely

More information

Mobile Motion: Multimodal Device Augmentation for Musical Applications

Mobile Motion: Multimodal Device Augmentation for Musical Applications Mobile Motion: Multimodal Device Augmentation for Musical Applications School of Computing, School of Electronic and Electrical Engineering and School of Music ICSRiM, University of Leeds, United Kingdom

More information

Elements of Haptic Interfaces

Elements of Haptic Interfaces Elements of Haptic Interfaces Katherine J. Kuchenbecker Department of Mechanical Engineering and Applied Mechanics University of Pennsylvania kuchenbe@seas.upenn.edu Course Notes for MEAM 625, University

More information

Design and Implementation of an Intuitive Gesture Recognition System Using a Hand-held Device

Design and Implementation of an Intuitive Gesture Recognition System Using a Hand-held Device Design and Implementation of an Intuitive Gesture Recognition System Using a Hand-held Device Hung-Chi Chu 1, Yuan-Chin Cheng 1 1 Department of Information and Communication Engineering, Chaoyang University

More information

Debugging a Boundary-Scan I 2 C Script Test with the BusPro - I and I2C Exerciser Software: A Case Study

Debugging a Boundary-Scan I 2 C Script Test with the BusPro - I and I2C Exerciser Software: A Case Study Debugging a Boundary-Scan I 2 C Script Test with the BusPro - I and I2C Exerciser Software: A Case Study Overview When developing and debugging I 2 C based hardware and software, it is extremely helpful

More information

2D Floor-Mapping Car

2D Floor-Mapping Car CDA 4630 Embedded Systems Final Report Group 4: Camilo Moreno, Ahmed Awada ------------------------------------------------------------------------------------------------------------------------------------------

More information

Interacting within Virtual Worlds (based on talks by Greg Welch and Mark Mine)

Interacting within Virtual Worlds (based on talks by Greg Welch and Mark Mine) Interacting within Virtual Worlds (based on talks by Greg Welch and Mark Mine) Presentation Working in a virtual world Interaction principles Interaction examples Why VR in the First Place? Direct perception

More information

TOUCH & FEEL VIRTUAL REALITY. DEVELOPMENT KIT - VERSION NOVEMBER 2017

TOUCH & FEEL VIRTUAL REALITY. DEVELOPMENT KIT - VERSION NOVEMBER 2017 TOUCH & FEEL VIRTUAL REALITY DEVELOPMENT KIT - VERSION 1.1 - NOVEMBER 2017 www.neurodigital.es Minimum System Specs Operating System Windows 8.1 or newer Processor AMD Phenom II or Intel Core i3 processor

More information

High-Level Programming for Industrial Robotics: using Gestures, Speech and Force Control

High-Level Programming for Industrial Robotics: using Gestures, Speech and Force Control High-Level Programming for Industrial Robotics: using Gestures, Speech and Force Control Pedro Neto, J. Norberto Pires, Member, IEEE Abstract Today, most industrial robots are programmed using the typical

More information

DEVELOPMENT KIT - VERSION NOVEMBER Product information PAGE 1

DEVELOPMENT KIT - VERSION NOVEMBER Product information PAGE 1 DEVELOPMENT KIT - VERSION 1.1 - NOVEMBER 2017 Product information PAGE 1 Minimum System Specs Operating System Windows 8.1 or newer Processor AMD Phenom II or Intel Core i3 processor or greater Memory

More information

Team Breaking Bat Architecture Design Specification. Virtual Slugger

Team Breaking Bat Architecture Design Specification. Virtual Slugger Department of Computer Science and Engineering The University of Texas at Arlington Team Breaking Bat Architecture Design Specification Virtual Slugger Team Members: Sean Gibeault Brandon Auwaerter Ehidiamen

More information

3-Degrees of Freedom Robotic ARM Controller for Various Applications

3-Degrees of Freedom Robotic ARM Controller for Various Applications 3-Degrees of Freedom Robotic ARM Controller for Various Applications Mohd.Maqsood Ali M.Tech Student Department of Electronics and Instrumentation Engineering, VNR Vignana Jyothi Institute of Engineering

More information

MECHANICAL DESIGN LEARNING ENVIRONMENTS BASED ON VIRTUAL REALITY TECHNOLOGIES

MECHANICAL DESIGN LEARNING ENVIRONMENTS BASED ON VIRTUAL REALITY TECHNOLOGIES INTERNATIONAL CONFERENCE ON ENGINEERING AND PRODUCT DESIGN EDUCATION 4 & 5 SEPTEMBER 2008, UNIVERSITAT POLITECNICA DE CATALUNYA, BARCELONA, SPAIN MECHANICAL DESIGN LEARNING ENVIRONMENTS BASED ON VIRTUAL

More information

HMD based VR Service Framework. July Web3D Consortium Kwan-Hee Yoo Chungbuk National University

HMD based VR Service Framework. July Web3D Consortium Kwan-Hee Yoo Chungbuk National University HMD based VR Service Framework July 31 2017 Web3D Consortium Kwan-Hee Yoo Chungbuk National University khyoo@chungbuk.ac.kr What is Virtual Reality? Making an electronic world seem real and interactive

More information

Cody Narber, M.S. Department of Computer Science, George Mason University

Cody Narber, M.S. Department of Computer Science, George Mason University Cody Narber, M.S. cnarber@gmu.edu Department of Computer Science, George Mason University Lynn Gerber, MD Professor, College of Health and Human Services Director, Center for the Study of Chronic Illness

More information

VR Haptic Interfaces for Teleoperation : an Evaluation Study

VR Haptic Interfaces for Teleoperation : an Evaluation Study VR Haptic Interfaces for Teleoperation : an Evaluation Study Renaud Ott, Mario Gutiérrez, Daniel Thalmann, Frédéric Vexo Virtual Reality Laboratory Ecole Polytechnique Fédérale de Lausanne (EPFL) CH-1015

More information

A Kinect-based 3D hand-gesture interface for 3D databases

A Kinect-based 3D hand-gesture interface for 3D databases A Kinect-based 3D hand-gesture interface for 3D databases Abstract. The use of natural interfaces improves significantly aspects related to human-computer interaction and consequently the productivity

More information

Haptics-Augmented Physics Simulation: Coriolis Effect

Haptics-Augmented Physics Simulation: Coriolis Effect Haptics-Augmented Physics Simulation: Coriolis Effect Felix G. Hamza-Lup, Benjamin Page Computer Science and Information Technology Armstrong Atlantic State University Savannah, GA 31419, USA E-mail: felix.hamza-lup@armstrong.edu

More information

Learning Actions from Demonstration

Learning Actions from Demonstration Learning Actions from Demonstration Michael Tirtowidjojo, Matthew Frierson, Benjamin Singer, Palak Hirpara October 2, 2016 Abstract The goal of our project is twofold. First, we will design a controller

More information

Multi-Modal User Interaction

Multi-Modal User Interaction Multi-Modal User Interaction Lecture 4: Multiple Modalities Zheng-Hua Tan Department of Electronic Systems Aalborg University, Denmark zt@es.aau.dk MMUI, IV, Zheng-Hua Tan 1 Outline Multimodal interface

More information

LDOR: Laser Directed Object Retrieving Robot. Final Report

LDOR: Laser Directed Object Retrieving Robot. Final Report University of Florida Department of Electrical and Computer Engineering EEL 5666 Intelligent Machines Design Laboratory LDOR: Laser Directed Object Retrieving Robot Final Report 4/22/08 Mike Arms TA: Mike

More information

Development of excavator training simulator using leap motion controller

Development of excavator training simulator using leap motion controller Journal of Physics: Conference Series PAPER OPEN ACCESS Development of excavator training simulator using leap motion controller To cite this article: F Fahmi et al 2018 J. Phys.: Conf. Ser. 978 012034

More information

WHITE PAPER Need for Gesture Recognition. April 2014

WHITE PAPER Need for Gesture Recognition. April 2014 WHITE PAPER Need for Gesture Recognition April 2014 TABLE OF CONTENTS Abstract... 3 What is Gesture Recognition?... 4 Market Trends... 6 Factors driving the need for a Solution... 8 The Solution... 10

More information

A Study on Motion-Based UI for Running Games with Kinect

A Study on Motion-Based UI for Running Games with Kinect A Study on Motion-Based UI for Running Games with Kinect Jimin Kim, Pyeong Oh, Hanho Lee, Sun-Jeong Kim * Interaction Design Graduate School, Hallym University 1 Hallymdaehak-gil, Chuncheon-si, Gangwon-do

More information

Touch Probe Cycles itnc 530

Touch Probe Cycles itnc 530 Touch Probe Cycles itnc 530 NC Software 340 420-xx 340 421-xx User s Manual English (en) 4/2002 TNC Models, Software and Features This manual describes functions and features provided by the TNCs as of

More information

Portfolio. Swaroop Kumar Pal swarooppal.wordpress.com github.com/swarooppal1088

Portfolio. Swaroop Kumar Pal swarooppal.wordpress.com github.com/swarooppal1088 Portfolio About Me: I am a Computer Science graduate student at The University of Texas at Dallas. I am currently working as Augmented Reality Engineer at Aireal, Dallas and also as a Graduate Researcher

More information

How to Create a Touchless Slider for Human Interface Applications

How to Create a Touchless Slider for Human Interface Applications How to Create a Touchless Slider for Human Interface Applications By Steve Gerber, Director of Human Interface Products Silicon Laboratories Inc., Austin, TX Introduction Imagine being able to control

More information

ZX Distance and Gesture Sensor Hookup Guide

ZX Distance and Gesture Sensor Hookup Guide Page 1 of 13 ZX Distance and Gesture Sensor Hookup Guide Introduction The ZX Distance and Gesture Sensor is a collaboration product with XYZ Interactive. The very smart people at XYZ Interactive have created

More information

AN ARDUINO CONTROLLED CHAOTIC PENDULUM FOR A REMOTE PHYSICS LABORATORY

AN ARDUINO CONTROLLED CHAOTIC PENDULUM FOR A REMOTE PHYSICS LABORATORY AN ARDUINO CONTROLLED CHAOTIC PENDULUM FOR A REMOTE PHYSICS LABORATORY J. C. Álvarez, J. Lamas, A. J. López, A. Ramil Universidade da Coruña (SPAIN) carlos.alvarez@udc.es, jlamas@udc.es, ana.xesus.lopez@udc.es,

More information

Touch Probe Cycles TNC 426 TNC 430

Touch Probe Cycles TNC 426 TNC 430 Touch Probe Cycles TNC 426 TNC 430 NC Software 280 472-xx 280 473-xx 280 474-xx 280 475-xx 280 476-xx 280 477-xx User s Manual English (en) 6/2003 TNC Model, Software and Features This manual describes

More information

Lamb Wave Ultrasonic Stylus

Lamb Wave Ultrasonic Stylus Lamb Wave Ultrasonic Stylus 0.1 Motivation Stylus as an input tool is used with touchscreen-enabled devices, such as Tablet PCs, to accurately navigate interface elements, send messages, etc. They are,

More information

Department of Computer Science and Engineering The Chinese University of Hong Kong. Year Final Year Project

Department of Computer Science and Engineering The Chinese University of Hong Kong. Year Final Year Project Digital Interactive Game Interface Table Apps for ipad Supervised by: Professor Michael R. Lyu Student: Ng Ka Hung (1009615714) Chan Hing Faat (1009618344) Year 2011 2012 Final Year Project Department

More information

IMU: Get started with Arduino and the MPU 6050 Sensor!

IMU: Get started with Arduino and the MPU 6050 Sensor! 1 of 5 16-3-2017 15:17 IMU Interfacing Tutorial: Get started with Arduino and the MPU 6050 Sensor! By Arvind Sanjeev, Founder of DIY Hacking Arduino MPU 6050 Setup In this post, I will be reviewing a few

More information

Hand Data Glove: A Wearable Real-Time Device for Human- Computer Interaction

Hand Data Glove: A Wearable Real-Time Device for Human- Computer Interaction Hand Data Glove: A Wearable Real-Time Device for Human- Computer Interaction Piyush Kumar 1, Jyoti Verma 2 and Shitala Prasad 3 1 Department of Information Technology, Indian Institute of Information Technology,

More information

Virtual Sculpting and Multi-axis Polyhedral Machining Planning Methodology with 5-DOF Haptic Interface

Virtual Sculpting and Multi-axis Polyhedral Machining Planning Methodology with 5-DOF Haptic Interface Virtual Sculpting and Multi-axis Polyhedral Machining Planning Methodology with 5-DOF Haptic Interface Weihang Zhu and Yuan-Shin Lee* Department of Industrial Engineering North Carolina State University,

More information