Available online at ScienceDirect. Procedia Computer Science 109C (2017) 59–66
The 8th International Conference on Ambient Systems, Networks and Technologies (ANT 2017)

Curved - free-form interaction using capacitive proximity sensors

Andreas Braun a,b,*, Sebastian Zander-Walz a, Martin Majewski a, Arjan Kuijper a,b

a Fraunhofer Institute for Computer Graphics Research IGD, Fraunhoferstr. 5, Darmstadt, Germany
b Technische Universität Darmstadt, Karolinenplatz 5, Darmstadt, Germany

Abstract

Large interactive surfaces have found increased popularity in recent years. However, with increased surface size, ergonomics becomes more important, as interacting for extended periods may cause fatigue. Curved is a large-surface interaction device designed to follow the natural movement of a stretched arm when performing gestures. It tracks one or two hands above the surface using an array of capacitive proximity sensors and supports both touch and mid-air gestures. It requires specific object tracking methods and the synchronized measurement of 32 sensors. We have created an example application for users wearing a virtual reality headset while seated, who may benefit from haptic feedback and ergonomically shaped surfaces. A prototype with adaptive curvature has been created that allows us to evaluate gesture recognition performance and different surface inclinations.

© The Authors. Published by Elsevier B.V. Peer-review under responsibility of the Conference Program Chairs.

Keywords: Curved surfaces; capacitive sensing; gestural interaction; virtual reality

1. Main text

Large interactive surfaces have been a research interest for several decades, with various applications in research, commerce, and industry. Typically, they are controlled using various touch gestures. However, there are some varieties that enable hand tracking above the surface [1,2]. The last few years have seen a revived interest in virtual reality (VR) systems.
Sensor-augmented headsets improve the user experience, and numerous commercial systems either have arrived on the market or are about to do so.

* Corresponding author. E-mail address: andreas.braun@igd.fraunhofer.de
Fig. 1. Person interacting with Curved in a VR application

Visual cues are an important aspect of human-computer interaction (HCI). Knowing the position of interaction elements is achieved either by visual identification or by learning their positions. There is no straightforward way to provide these for VR systems. Most commonly, this limitation is overcome by relying on haptic feedback devices or by providing visual feedback in the virtual world [3]. Combining interactive surfaces and VR has a number of potential applications, ranging from 3D manipulation to configurable haptic user interfaces. However, so far these surfaces rarely consider body kinematics and remain flat or shaped non-ergonomically [4]. In applications where prolonged use is necessary, a device that allows for haptic feedback can help reduce fatigue by letting users rest their hands. This is not directly supported by common hand tracking devices, such as the Leap Motion or the Microsoft Kinect [5,6].

In this work we present Curved - an interaction device with a curved surface, based on capacitive proximity sensors (Fig. 1). It is able to detect the presence and position of one or two hands at a distance of approximately 20 centimeters from the surface and distinguishes mid-air and touch gestures. Curved comprises eight modules with four capacitive sensors each. The inclination of the modules can be adapted, so it is suited for persons sitting or standing in front of it. We have created a demonstration application in VR that visualizes the effects of different lighting systems and their impact on the energy consumption in a home environment. In addition, the gesture classification module was tested with ten users.
In short, we propose the following scientific contributions:

- a novel free-form interaction system whose shape follows the kinematic range of a person's hands,
- a shape that is adaptive for sitting/standing persons and supports touch/mid-air gestures, and
- a VR demonstration application for visualizing energy consumption, together with a user evaluation.

2. Related Works

Curved display systems have been of commercial and research interest in the past few years. Improved display technologies, such as OLED or e-ink, can be applied more flexibly, which has resulted in various curved displays on the market, from small smartphone screens to home theater systems [7,8]. The University of Groningen created the Reality Touch Theater - a large-area system with a curved projection area that supports touch interaction from multiple users who move around the area [9]. The system is designed for collaborative multi-user interaction with the vertical surface, not taking into account individual kinematic constraints.

Researchers have been active in creating large-area interaction systems for standing or sitting persons. Rekimoto used an array of capacitive sensors to build SmartSkin, an early example of a flat interactive surface [10]. More
recently, Annett et al. created Medusa, a combination of a regular touch surface and an array of IR sensors for proximity-aware interaction [4]. The example applications for Medusa included user differentiation and combined touch & proximity gestures. However, this system is flat, so users have to stretch to reach different parts. Modern sensor systems enable the creation of irregularly shaped or even deformable interaction systems. Wimmer and Baudisch use time-domain reflectometry to detect a touch at a certain position along a wire [11]. By applying this wire to a soft silicone surface, they are able to create stretchable interactive surfaces. Roudaut et al. evaluated user preference when interacting with curved interactive surfaces [12]. They created guidelines for device and interface designers on how to adapt systems based on the chosen curvature. The created shapes are small and primarily aimed at interaction with fingers or a single hand.

Capacitive proximity sensors detect the presence and activities of a human body at a distance. A classic example is the Gesture Wall - an interactive art installation controlled by hand movements in front of spherical antennas [13]. Recent examples explored using transparent electrodes and new processing methods for tracking fingers and hands [14,15]. Bachynskyi et al. have recently started investigating the ergonomics of various user interfaces based on biomechanical simulation [16,17]. They propose that a lower amount of kinetic energy used when interacting can increase the efficiency and acceptance of HCI systems. This can be simulated or measured, e.g. by using motion capture systems, leading to increased research into devices that are more efficient to use. Even though many systems are aimed at gaming, VR has a high potential in work environments [18].
Systems such as Curved can be another step towards the acceptance of VR systems in the work environment, by reducing necessary upper-body movement, avoiding fatigue, and increasing intuitiveness of use.

3. Curved surface interaction

Fig. 2. Left - kinematic range of the hands with static shoulders. Right - rotational 3D model of the arm range

The first step in the design of Curved is finding the proper layout of the interaction system. We consider a person wearing VR goggles who is sitting or standing in front of the interactive surface. In order to minimize the required movement of the body, we assume that the shoulders stay static, while the arms and hands can move freely. A simple two-dimensional kinematic model of this is shown in Fig. 2 on the left. The red line is the convex hull created when the hands are stretched out. There is an overlapping area in the middle, as indicated by the dashed line. This area right in front of the user can be reached by both hands equally. This model can be transferred into three dimensions. While the shoulders remain static, the arms are rotated around all joints. As a simplification, the overlapping area in front of the user is modeled as a flat surface. The resulting body is the convex hull of the area that can be reached by outstretched arms and hands. It is shown in Fig. 2 on the right.

The resulting rotational body is the template for Curved. Some limitations apply for practical purposes. The lower part of this rotational body needs to provide space for the person sitting or standing in front of it. Therefore, an area should be left open that leaves sufficient space for the user. The second limitation is based on the requirement for sitting and standing interaction. The position of the shoulder relative to Curved is considerably different for each
case. Either the system has to be moved up and down, or the surface has to adapt to the current posture. Finally, the whole rotational body is very large when installed. Therefore, the interaction area can be reduced by not providing the full 180° coverage. The final geometric design is shown in Fig. 3. Curved comprises eight modules that are hinged at the side nearest to the user. The back side can be moved up and down and locked into several positions, thus fixing their inclination. A low inclination is suitable for a standing user who wants to interact with the arms down (e.g. in Fig. 1), whereas a higher inclination is more suitable for sitting users. There is a circular cutout in the center to accommodate a standing or sitting user. The geometry of the Curved system resembles a reduced polygon model of the rotational body. This reduces complexity while keeping the system mechanically feasible. Each module is equipped with four loading-mode capacitive proximity sensors that use copper foil electrodes. They can track the presence and position of hands on or above the surface.

4. Prototype

Fig. 3. Concept sketch of the final Curved design.

Fig. 4. Left: overall view of the Curved prototype; center: copper electrodes and sensor of a single module; right: detail view of the mechanical joints and connected sensors

The Curved prototype system is based on the OpenCapSense toolkit for capacitive proximity sensing [19]. This toolkit has been used for various interaction systems in the past [20,21]. Four boards are synchronized over a CAN bus using customized firmware. This synchronization is necessary, since adjacent boards may disturb one another: if an electrode is charged next to one that is discharged, the measurement may be affected. This synchronization is a built-in feature for the eight sensors connected to a single board, but was unsupported across multiple boards.
In our case we perform simultaneous measurements of modules 1, 2, 5, and 6 (as shown in Fig. 4 - left), as the boards involved are sufficiently far away from one another. Once the measurement is complete, modules 3, 4, 7, and 8 are used. This limits the effective sampling rate from 20 Hz to 10 Hz. Each board interfaces the sensors of two modules. For each module we use an acrylic plate as the base. Four copper foil electrodes are glued on and connected via a soldered wire to the small sensor boards (Fig. 4 - center). Each of those modules is then attached to the joint structure that allows modifying the inclination of the system between 10° and 25° (Fig. 4 - right). The sensors are connected to the main OpenCapSense board using USB cables. Finally, the eight modules are mounted on a wooden frame (Fig. 4 - left).
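The interleaved measurement schedule described above can be sketched as follows. This is an illustrative model only: the module grouping matches the description in the text, but `read_group` and all other names are hypothetical stand-ins for the actual OpenCapSense firmware interface.

```python
# Illustrative sketch of the interleaved measurement schedule.
# read_group() is a hypothetical stand-in for the synchronized
# CAN-bus measurement of the real OpenCapSense firmware.

GROUP_A = (1, 2, 5, 6)   # modules measured in the first half-cycle
GROUP_B = (3, 4, 7, 8)   # modules measured in the second half-cycle
BOARD_RATE_HZ = 20       # rate at which one group can be sampled

def read_group(modules):
    """Synchronized read of the four sensors on each given module;
    here it just returns zeroed placeholder values."""
    return {m: [0, 0, 0, 0] for m in modules}

def acquire_frame():
    """One full frame of all 32 sensors. The two spatially separated
    groups are measured one after the other, so electrodes that are
    being charged never sit next to electrodes being discharged."""
    frame = {}
    frame.update(read_group(GROUP_A))   # first half-cycle
    frame.update(read_group(GROUP_B))   # second half-cycle
    return frame

# Measuring in two half-cycles halves the effective sampling rate:
EFFECTIVE_RATE_HZ = BOARD_RATE_HZ / 2   # 20 Hz -> 10 Hz
```

One frame thus always contains all eight modules (32 sensor values), at half the per-group rate of the boards.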
The Curved system is connected via USB-to-serial interfaces to a PC that performs the previously mentioned calibration, baselining, and drift compensation, as well as the generation of capacitive images and gesture classification. The software is implemented in Java, using OpenCV for image processing and the Weka framework for gesture classification (using the libsvm module). The recognized gestures can be used by any application for interaction. We have created a demonstration application in VR that visualizes energy consumption choices in a smart home environment.

5. Data processing

Fig. 5. Processing of integer capacitance values (a) to capacitive images (b) and recognized objects and shapes (c)

The loading-mode sensors provide a capacitance measurement that is mapped into a bit value by an analog-digital converter. We apply basic methods of baseline creation, calibration, and drift compensation for a stable signal [22]. For further analysis, including gesture recognition, we use the geometric positions of the sensors and the capacitance values to generate a capacitive image.

5.1. Generating capacitive images

The process of creating a capacitive image is shown in Fig. 5. The integer values from the sensors can be considered a low-resolution grayscale image. We have the option to use this image directly for object recognition or to first apply additional image processing methods. Fig. 5 (c) shows an image that is scaled up with bicubic interpolation and converted to a binary image using a threshold. The small red circle marks the center of the object, which is calculated using image moments [23]. A new image is acquired from sensor data every 50 ms. The algorithm clusters and identifies several objects in proximity of, or touching, Curved. The analysis of subsequent centers is used as a basis for the gesture recognition.

Fig. 6. Gesture categories supported by Curved
5.2. Gesture recognition

Curved supports three different types of gestures, shown in Fig. 6. Each can be performed either by touching the surface or by moving the hands through the interaction space of approximately 20 cm above the surface. The system can also be configured to support only touch or only mid-air interaction. The first gesture category is pointing gestures (a), performed by holding the hand in the interaction area for more than 300 ms. The second category is one-handed swipes (b) in eight directions. These again can be performed on or off the surface. The last category is two-handed pinch-and-zoom gestures (c), which are detected if two objects are present and move towards or away from one another. Here we distinguish six varieties (horizontal and diagonal movements, with centers moving closer or further apart). Gestures are classified once six consecutive images (duration 300 ms) have contained objects. A support vector machine (SVM) is used for classification, with the relative positions of the six object centers used as features (twelve centers in the case of two-handed gestures). If two objects are present, only pinch-to-zoom gestures are enabled. SVMs are often used in discrete classification tasks and have good performance [24]. Overall, there are 15 different gestures that can be performed. The system was trained by two users with 20 samples for each gesture.

6. Interaction in Virtual Reality environments

A main application area that we see for Curved is interaction in scenarios where there is no visual reference for the position of the user's hands. Most notably this is the case in VR, when a headset is worn. In the last few years, several such systems have been released, ranging from smartphone-based varieties, such as Google Cardboard, to PC-based systems, such as the Oculus Rift.
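The center-trajectory classification described above can be sketched as follows. This uses scikit-learn's SVM as a stand-in for the actual Weka/libsvm setup; the feature construction is one plausible reading of "relative position of six object centers", and the two-class training data is invented for illustration (the real system distinguishes 15 gestures).

```python
import numpy as np
from sklearn.svm import SVC

def trajectory_features(centers):
    """centers: six (x, y) object centers from consecutive capacitive
    images (300 ms of data). Positions are taken relative to the first
    center so the feature is translation-invariant - one plausible
    reading of the 'relative position' features described above."""
    c = np.asarray(centers, dtype=float)
    rel = c - c[0]
    return rel.flatten()            # 12-dimensional feature vector

# Invented training data: a right swipe (class 0) and a left swipe
# (class 1), 20 jittered samples each, mimicking the 20 samples per
# gesture used to train the real system.
rng = np.random.default_rng(0)
X, y = [], []
for label, direction in [(0, 1.0), (1, -1.0)]:
    for _ in range(20):
        xs = np.linspace(0, direction * 10, 6) + rng.normal(0, 0.3, 6)
        ys = rng.normal(0, 0.3, 6)
        X.append(trajectory_features(np.column_stack([xs, ys])))
        y.append(label)

clf = SVC(kernel="rbf").fit(np.array(X), y)
```

A fresh rightward center trajectory is then classified by `clf.predict([trajectory_features(...)])`; for two-handed gestures, the twelve centers of both objects would be concatenated into one feature vector in the same way.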
Our application was implemented on the Oculus Rift, using the available development kits.

Fig. 7. Left - traffic light system for energy usage and current energy consumption. Right - visualizing the effects of different lighting scenarios (top: lamp off, middle: fluorescent lighting, bottom: LED lighting)

6.1. Making Energy Visible application

To explore energy usage and to enable users to understand the consequences of their actions and choices, as well as the impact on the environment, the demonstrator consists of a rich virtual reality simulation. This simulation goes beyond displaying lists of metrics and charts of cash flow. It provides a comprehensive and easy-to-understand visualization approach (shown in Fig. 7 - left). It connects the chosen setup (the ensemble) of energy-consuming devices (e.g. the type of light sources, from fluorescent bulbs to modern LEDs, entertainment sets, or household appliances) and brings it into correlation with the user's perception of the environment (shown in Fig. 7 - right). The environment is created using the Unity3D game engine. The input and output device management submodule accompanies the Unity3D framework by providing an abstraction layer for a variety of different input devices, such as gamepads, joysticks, mice, and keyboards. Curved hooks into the C# scripting environment of Unity3D and translates the input signals into Unity3D events.
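The input-abstraction idea can be illustrated as follows: recognized gestures become generic events that the application consumes like any other input device. The actual layer is written in C# inside Unity3D; this Python sketch only shows the pattern, and all names are hypothetical.

```python
# Sketch of the input-abstraction layer: classifier output is routed
# to registered application handlers as generic events. The real
# layer is C# inside Unity3D; all names here are illustrative.

from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class GestureEvent:
    kind: str     # "point", "swipe" or "pinch"
    detail: str   # e.g. swipe direction or pinch variety

class GestureDispatcher:
    """Routes recognized gestures to registered handlers."""

    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[GestureEvent], None]]] = {}

    def on(self, kind: str, handler: Callable[[GestureEvent], None]) -> None:
        self._handlers.setdefault(kind, []).append(handler)

    def emit(self, event: GestureEvent) -> None:
        for handler in self._handlers.get(event.kind, []):
            handler(event)

# Wiring that mirrors the demonstrator's menu interaction: pointing
# opens the menu, swipes move through it, a pinch closes it.
menu = {"open": False, "index": 0}
dispatcher = GestureDispatcher()
dispatcher.on("point", lambda e: menu.update(open=True))
dispatcher.on("swipe", lambda e: menu.update(
    index=menu["index"] + (1 if e.detail == "right" else -1)))
dispatcher.on("pinch", lambda e: menu.update(open=False))
```

Keeping the application code behind such an event interface is what allows Curved to sit alongside gamepads, mice, and keyboards as just another input device.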
The user interacts with the environment using all supported gestures. The user selects different devices by moving the head. A device can be selected using a pointing gesture, which opens the energy visualization menu. Here, the user can move through the menu using left and right swipes. A pinch gesture closes the menu again.

7. Evaluation

A study was performed with 10 users (8 male, 2 female, average age 25.4). Most participants had previous experience with gesture interaction systems. We wanted to evaluate the gesture recognition performance and the preferred inclination when using the system. The users were shown each gesture once and had no specific training time. The participants performed gestures at two different inclinations (10° and 25°), both in touch and mid-air, using a subset of the supported gestures (one point, three swipes, two pinch-to-zoom). The inclinations were chosen due to constraints of our structure, and the gestures were chosen based on their similarity to each other; adding more gestures would have led to similar results. Each gesture was performed five times. Finally, the participants were asked to fill in a questionnaire. Overall, 1200 gestures were performed during the evaluation. The system was pre-trained by two users a day before the evaluation with a training set of 300 gestures. These two users did not participate in the actual study.

The accuracy of the classification is shown in Fig. 8. With the exception of swipes to the left, we achieved a good accuracy. The overall average accuracy was 81% (89% if left swipes are excluded). Left swipes are often misclassified as right swipes, since the hand tends to move into the supported interaction space, thus sometimes performing a right swipe before the left swipe. This depended strongly on the participant. In general, it is difficult to distinguish between touch and mid-air gesture accuracy.
Both methods performed similarly in our evaluation (touch = 83%, mid-air = 79%), but the difference is not statistically significant (p = .32). The same applies to the inclination angles. While the high inclination (83%) performed slightly better than the low inclination (79%), the result is not statistically significant (p = .31). Finally, in the questionnaire we inquired about the preferred inclination (multiple choice), touch vs. mid-air (10-point Likert scale, 1 = touch, 10 = mid-air), and how tiring the interaction was (10-point Likert scale, 1 = not tiring, 10 = very tiring). All ten participants preferred the high-inclination setting, as the evaluation was performed standing. They slightly preferred mid-air interaction (µ = 6.6, σ = 3.2) and did not consider the system very tiring (µ = 4.5, σ = 2.2).

8. Conclusion & Future Works

Fig. 8. Classification accuracy of touch and mid-air gestures at low and high inclination

In this work we have presented Curved - a free-form interaction device based on capacitive proximity sensors that supports touch and mid-air gestural interaction. We consider it particularly suited for VR applications. We have presented the process of creating the ergonomic surface shape and how we created a prototype from it. A software application was programmed that illustrates the VR use case by combining Curved with the Oculus Rift. This application demonstrates the effects of device usage on energy consumption and how different lighting technologies
influence the environment. The evaluation showed that the gesture recognition performs sufficiently well and that users prefer a high-inclination setting for interaction.

In the future we would like to focus our research on two areas. Additive manufacturing, in particular functional 3D printing, offers the opportunity to integrate conductive material into printed objects. This would enable additional devices and use cases. The concept of ergonomic surface design can also be extended to other constrained shapes. For example, if we consider a static elbow or static finger joints, we could create much smaller interaction systems that can be added to a desk.

Acknowledgements

We would like to thank Stefan Krepp, Steeven Zeiss, and Maxim Djakow for their input to the hardware and software development, as well as our study participants for their valuable comments. This work was partially supported by EC Grant Agreement No.

References

1. H. Benko, R. Jota and A. Wilson. MirageTable: freehand interaction on a projected augmented reality tabletop. Proceedings CHI '12.
2. R. Wimmer, M. Kranz, S. Boring and A. Schmidt. CapTable and CapShelf - unobtrusive activity recognition using networked capacitive sensors. Proceedings INSS '07.
3. R.A. Earnshaw. Virtual reality systems. Academic Press.
4. M. Annett, T. Grossman, D. Wigdor and G. Fitzmaurice. Medusa: a proximity-aware multi-touch tabletop. Proceedings UIST '11.
5. F. Weichert, D. Bachmann, B. Rudak and D. Fisseler. Analysis of the accuracy and robustness of the Leap Motion controller. Sensors 13, 5.
6. Z. Ren, J. Meng, J. Yuan and Z. Zhang. Robust hand gesture recognition with Kinect sensor. Proceedings Multimedia 2011.
7. Samsung Inc. Samsung introduces the latest in its iconic Note series - the Galaxy Note 4, and showcases next generation display with Galaxy Note Edge. Retrieved March 31st, 2016.
8. Samsung Inc. CES 2014: Samsung unveils first curved Ultra High Definition (UHD) TVs. Retrieved March 31st, 2016.
9. University of Groningen. Reality Touch Theatre. Retrieved March 31st, 2016.
10. J. Rekimoto. SmartSkin: an infrastructure for freehand manipulation on interactive surfaces. Proceedings CHI '02.
11. R. Wimmer and P. Baudisch. Modular and deformable touch-sensitive surfaces based on time domain reflectometry. Proceedings UIST '11.
12. A. Roudaut, H. Pohl and P. Baudisch. Touch input on curved surfaces. Proceedings CHI '11.
13. J. Smith, T. White, C. Dodge, J. Paradiso, N. Gershenfeld and D. Allport. Electric field sensing for graphical interfaces. IEEE Computer Graphics and Applications 18, 3.
14. M. Le Goc, S. Taylor, S. Izadi, C. Keskin et al. A low-cost transparent electric field sensor for 3D interaction on mobile devices. Proceedings CHI '14.
15. T. Grosse-Puppendahl, A. Braun, F. Kamieth and A. Kuijper. Swiss-cheese extended: an object recognition method for ubiquitous interfaces based on capacitive proximity sensing. Proceedings CHI '13.
16. M. Bachynskyi, A. Oulasvirta, G. Palmas and T. Weinkauf. Is motion capture-based biomechanical simulation valid for HCI studies? Study and implications. Proceedings CHI '14.
17. M. Bachynskyi, G. Palmas, A. Oulasvirta, J. Steimle and T. Weinkauf. Performance and ergonomics of touch surfaces: a comparative study using biomechanical simulation. Proceedings CHI '15.
18. D. Ma, X. Fan, J. Gausemeier and M. Grafe, eds. Virtual Reality & Augmented Reality in Industry. Springer, Berlin Heidelberg.
19. T. Grosse-Puppendahl, Y. Berghoefer, A. Braun, R. Wimmer and A. Kuijper. OpenCapSense: a rapid prototyping toolkit for pervasive interaction using capacitive sensing. Proceedings PerCom 2013.
20. A. Braun, S. Zander-Walz, S. Krepp, S. Rus, R. Wichert and A. Kuijper. CapTap - combining capacitive gesture recognition and knock detection. Proceedings iWOAR 2016.
21. A. Braun and P. Hamisu. Designing a multi-purpose capacitive proximity sensing input device. Proceedings PETRA 2011.
22. A. Braun, R. Wichert, A. Kuijper and D.W. Fellner. Capacitive proximity sensing in smart environments. Journal of Ambient Intelligence and Smart Environments 7, 4 (2015).
23. M.-K. Hu. Visual pattern recognition by moment invariants. IRE Transactions on Information Theory 8, 2.
24. A. Braun, S. Krepp and A. Kuijper. Acoustic tracking of hand activities on surfaces. Proceedings iWOAR 2015, Article No. 9.
More informationResearch on Hand Gesture Recognition Using Convolutional Neural Network
Research on Hand Gesture Recognition Using Convolutional Neural Network Tian Zhaoyang a, Cheng Lee Lung b a Department of Electronic Engineering, City University of Hong Kong, Hong Kong, China E-mail address:
More informationiwindow Concept of an intelligent window for machine tools using augmented reality
iwindow Concept of an intelligent window for machine tools using augmented reality Sommer, P.; Atmosudiro, A.; Schlechtendahl, J.; Lechler, A.; Verl, A. Institute for Control Engineering of Machine Tools
More informationInteractive Coffee Tables: Interfacing TV within an Intuitive, Fun and Shared Experience
Interactive Coffee Tables: Interfacing TV within an Intuitive, Fun and Shared Experience Radu-Daniel Vatavu and Stefan-Gheorghe Pentiuc University Stefan cel Mare of Suceava, Department of Computer Science,
More informationInvestigating Gestures on Elastic Tabletops
Investigating Gestures on Elastic Tabletops Dietrich Kammer Thomas Gründer Chair of Media Design Chair of Media Design Technische Universität DresdenTechnische Universität Dresden 01062 Dresden, Germany
More informationUser s handbook Last updated in December 2017
User s handbook Last updated in December 2017 Contents Contents... 2 System info and options... 3 Mindesk VR-CAD interface basics... 4 Controller map... 5 Global functions... 6 Tool palette... 7 VR Design
More informationLOOKING AHEAD: UE4 VR Roadmap. Nick Whiting Technical Director VR / AR
LOOKING AHEAD: UE4 VR Roadmap Nick Whiting Technical Director VR / AR HEADLINE AND IMAGE LAYOUT RECENT DEVELOPMENTS RECENT DEVELOPMENTS At Epic, we drive our engine development by creating content. We
More informationInteraction Design for the Disappearing Computer
Interaction Design for the Disappearing Computer Norbert Streitz AMBIENTE Workspaces of the Future Fraunhofer IPSI 64293 Darmstadt Germany VWUHLW]#LSVLIUDXQKRIHUGH KWWSZZZLSVLIUDXQKRIHUGHDPELHQWH Abstract.
More informationSensing Human Activities With Resonant Tuning
Sensing Human Activities With Resonant Tuning Ivan Poupyrev 1 ivan.poupyrev@disneyresearch.com Zhiquan Yeo 1, 2 zhiquan@disneyresearch.com Josh Griffin 1 joshdgriffin@disneyresearch.com Scott Hudson 2
More informationVIRTUAL REALITY LAB Research group Softwarevisualisation in 3D and VR
VIRTUAL REALITY LAB Research group Softwarevisualisation in 3D and VR softvis@uni-leipzig.de http://home.uni-leipzig.de/svis/vr-lab/ VR Labor Hardware Portfolio OVERVIEW HTC Vive Oculus Rift Leap Motion
More informationThe Hand Gesture Recognition System Using Depth Camera
The Hand Gesture Recognition System Using Depth Camera Ahn,Yang-Keun VR/AR Research Center Korea Electronics Technology Institute Seoul, Republic of Korea e-mail: ykahn@keti.re.kr Park,Young-Choong VR/AR
More informationToward an Augmented Reality System for Violin Learning Support
Toward an Augmented Reality System for Violin Learning Support Hiroyuki Shiino, François de Sorbier, and Hideo Saito Graduate School of Science and Technology, Keio University, Yokohama, Japan {shiino,fdesorbi,saito}@hvrl.ics.keio.ac.jp
More informationDEVELOPMENT KIT - VERSION NOVEMBER Product information PAGE 1
DEVELOPMENT KIT - VERSION 1.1 - NOVEMBER 2017 Product information PAGE 1 Minimum System Specs Operating System Windows 8.1 or newer Processor AMD Phenom II or Intel Core i3 processor or greater Memory
More informationDirect gaze based environmental controls
Loughborough University Institutional Repository Direct gaze based environmental controls This item was submitted to Loughborough University's Institutional Repository by the/an author. Citation: SHI,
More informationAn Implementation Review of Occlusion-Based Interaction in Augmented Reality Environment
An Implementation Review of Occlusion-Based Interaction in Augmented Reality Environment Mohamad Shahrul Shahidan, Nazrita Ibrahim, Mohd Hazli Mohamed Zabil, Azlan Yusof College of Information Technology,
More informationHARDWARE SETUP GUIDE. 1 P age
HARDWARE SETUP GUIDE 1 P age INTRODUCTION Welcome to Fundamental Surgery TM the home of innovative Virtual Reality surgical simulations with haptic feedback delivered on low-cost hardware. You will shortly
More informationDesign and Implementation of the 3D Real-Time Monitoring Video System for the Smart Phone
ISSN (e): 2250 3005 Volume, 06 Issue, 11 November 2016 International Journal of Computational Engineering Research (IJCER) Design and Implementation of the 3D Real-Time Monitoring Video System for the
More informationVirtual Reality in Neuro- Rehabilitation and Beyond
Virtual Reality in Neuro- Rehabilitation and Beyond Amanda Carr, OTRL, CBIS Origami Brain Injury Rehabilitation Center Director of Rehabilitation Amanda.Carr@origamirehab.org Objectives Define virtual
More informationCollaboration on Interactive Ceilings
Collaboration on Interactive Ceilings Alexander Bazo, Raphael Wimmer, Markus Heckner, Christian Wolff Media Informatics Group, University of Regensburg Abstract In this paper we discuss how interactive
More informationFabrication of the kinect remote-controlled cars and planning of the motion interaction courses
Available online at www.sciencedirect.com ScienceDirect Procedia - Social and Behavioral Sciences 174 ( 2015 ) 3102 3107 INTE 2014 Fabrication of the kinect remote-controlled cars and planning of the motion
More informationCS415 Human Computer Interaction
CS415 Human Computer Interaction Lecture 10 Advanced HCI Universal Design & Intro to Cognitive Models October 30, 2016 Sam Siewert Summary of Thoughts on ITS Collective Wisdom of Our Classes (2015, 2016)
More informationQuality of Experience for Virtual Reality: Methodologies, Research Testbeds and Evaluation Studies
Quality of Experience for Virtual Reality: Methodologies, Research Testbeds and Evaluation Studies Mirko Sužnjević, Maja Matijašević This work has been supported in part by Croatian Science Foundation
More informationUbiquitous Computing Summer Episode 16: HCI. Hannes Frey and Peter Sturm University of Trier. Hannes Frey and Peter Sturm, University of Trier 1
Episode 16: HCI Hannes Frey and Peter Sturm University of Trier University of Trier 1 Shrinking User Interface Small devices Narrow user interface Only few pixels graphical output No keyboard Mobility
More informationRobust Hand Gesture Recognition for Robotic Hand Control
Robust Hand Gesture Recognition for Robotic Hand Control Ankit Chaudhary Robust Hand Gesture Recognition for Robotic Hand Control 123 Ankit Chaudhary Department of Computer Science Northwest Missouri State
More informationSensor system of a small biped entertainment robot
Advanced Robotics, Vol. 18, No. 10, pp. 1039 1052 (2004) VSP and Robotics Society of Japan 2004. Also available online - www.vsppub.com Sensor system of a small biped entertainment robot Short paper TATSUZO
More informationDiamondTouch SDK:Support for Multi-User, Multi-Touch Applications
MITSUBISHI ELECTRIC RESEARCH LABORATORIES http://www.merl.com DiamondTouch SDK:Support for Multi-User, Multi-Touch Applications Alan Esenther, Cliff Forlines, Kathy Ryall, Sam Shipman TR2002-48 November
More informationSynergy Model of Artificial Intelligence and Augmented Reality in the Processes of Exploitation of Energy Systems
Journal of Energy and Power Engineering 10 (2016) 102-108 doi: 10.17265/1934-8975/2016.02.004 D DAVID PUBLISHING Synergy Model of Artificial Intelligence and Augmented Reality in the Processes of Exploitation
More informationGesture Identification Using Sensors Future of Interaction with Smart Phones Mr. Pratik Parmar 1 1 Department of Computer engineering, CTIDS
Gesture Identification Using Sensors Future of Interaction with Smart Phones Mr. Pratik Parmar 1 1 Department of Computer engineering, CTIDS Abstract Over the years from entertainment to gaming market,
More information3D User Interfaces. Using the Kinect and Beyond. John Murray. John Murray
Using the Kinect and Beyond // Center for Games and Playable Media // http://games.soe.ucsc.edu John Murray John Murray Expressive Title Here (Arial) Intelligence Studio Introduction to Interfaces User
More informationOcclusion-Aware Menu Design for Digital Tabletops
Occlusion-Aware Menu Design for Digital Tabletops Peter Brandl peter.brandl@fh-hagenberg.at Jakob Leitner jakob.leitner@fh-hagenberg.at Thomas Seifried thomas.seifried@fh-hagenberg.at Michael Haller michael.haller@fh-hagenberg.at
More informationRemote Shoulder-to-shoulder Communication Enhancing Co-located Sensation
Remote Shoulder-to-shoulder Communication Enhancing Co-located Sensation Minghao Cai and Jiro Tanaka Graduate School of Information, Production and Systems Waseda University Kitakyushu, Japan Email: mhcai@toki.waseda.jp,
More informationUsing sound levels for location tracking
Using sound levels for location tracking Sasha Ames sasha@cs.ucsc.edu CMPE250 Multimedia Systems University of California, Santa Cruz Abstract We present an experiemnt to attempt to track the location
More informationDouble-side Multi-touch Input for Mobile Devices
Double-side Multi-touch Input for Mobile Devices Double side multi-touch input enables more possible manipulation methods. Erh-li (Early) Shen Jane Yung-jen Hsu National Taiwan University National Taiwan
More informationWands are Magic: a comparison of devices used in 3D pointing interfaces
Wands are Magic: a comparison of devices used in 3D pointing interfaces Martin Henschke, Tom Gedeon, Richard Jones, Sabrina Caldwell and Dingyun Zhu College of Engineering and Computer Science, Australian
More informationEnvironmental control by remote eye tracking
Loughborough University Institutional Repository Environmental control by remote eye tracking This item was submitted to Loughborough University's Institutional Repository by the/an author. Citation: SHI,
More informationF210 Vision Sensor Flow Menus and Macro Capability
F210 Vision Sensor Flow Menus and Macro Capability Reduce the overhead involved in system planning and introduction. The Three Features of Flow Menus and Macros The Flow Menus and Macros of the F500-UM3FE/UM3ME
More informationComparison of Head Movement Recognition Algorithms in Immersive Virtual Reality Using Educative Mobile Application
Comparison of Head Recognition Algorithms in Immersive Virtual Reality Using Educative Mobile Application Nehemia Sugianto 1 and Elizabeth Irenne Yuwono 2 Ciputra University, Indonesia 1 nsugianto@ciputra.ac.id
More informationVirtual prototyping based development and marketing of future consumer electronics products
31 Virtual prototyping based development and marketing of future consumer electronics products P. J. Pulli, M. L. Salmela, J. K. Similii* VIT Electronics, P.O. Box 1100, 90571 Oulu, Finland, tel. +358
More informationMulti-touch Interface for Controlling Multiple Mobile Robots
Multi-touch Interface for Controlling Multiple Mobile Robots Jun Kato The University of Tokyo School of Science, Dept. of Information Science jun.kato@acm.org Daisuke Sakamoto The University of Tokyo Graduate
More informationSIMULATION-BASED MODEL CONTROL USING STATIC HAND GESTURES IN MATLAB
SIMULATION-BASED MODEL CONTROL USING STATIC HAND GESTURES IN MATLAB S. Kajan, J. Goga Institute of Robotics and Cybernetics, Faculty of Electrical Engineering and Information Technology, Slovak University
More informationMEASURING AND ANALYZING FINE MOTOR SKILLS
MEASURING AND ANALYZING FINE MOTOR SKILLS PART 1: MOTION TRACKING AND EMG OF FINE MOVEMENTS PART 2: HIGH-FIDELITY CAPTURE OF HAND AND FINGER BIOMECHANICS Abstract This white paper discusses an example
More informationExploring Surround Haptics Displays
Exploring Surround Haptics Displays Ali Israr Disney Research 4615 Forbes Ave. Suite 420, Pittsburgh, PA 15213 USA israr@disneyresearch.com Ivan Poupyrev Disney Research 4615 Forbes Ave. Suite 420, Pittsburgh,
More informationCOMET: Collaboration in Applications for Mobile Environments by Twisting
COMET: Collaboration in Applications for Mobile Environments by Twisting Nitesh Goyal RWTH Aachen University Aachen 52056, Germany Nitesh.goyal@rwth-aachen.de Abstract In this paper, we describe a novel
More informationComposite Body-Tracking:
Composite Body-Tracking: Device Abstraction Layer with Data Fusion for Gesture Recognition in Virtual Reality Applications Vortragender: Betreuer: Verantwortlicher Professor: Luis Alejandro Rojas Vargas
More informationCurriculum Vitae. Contact Information. Work Experience. Education. Patents
Curriculum Vitae Tobias Grosse-Puppendahl Dr. Ing. h.c. F. Porsche AG Porschestr. 911 71287 Weissach, Germany Contact Information E-Mail: Date of Birth: 6th April 1986 Place of Birth: Nationality: Languages:
More informationStep. A Big Step Forward for Virtual Reality
Step A Big Step Forward for Virtual Reality Advisor: Professor Goeckel 1 Team Members Ryan Daly Electrical Engineer Jared Ricci Electrical Engineer Joseph Roberts Electrical Engineer Steven So Electrical
More informationMETIS EDS GAMMA. Metis desktop professional scanner : great quality, fast and easy to use!
Metis desktop professional scanner : great quality, fast and easy to use! The V-Table design The integrates an innovative V-Table design supported by specific software tools in order to hold the originals
More informationSimulation of Tangible User Interfaces with the ROS Middleware
Simulation of Tangible User Interfaces with the ROS Middleware Stefan Diewald 1 stefan.diewald@tum.de Andreas Möller 1 andreas.moeller@tum.de Luis Roalter 1 roalter@tum.de Matthias Kranz 2 matthias.kranz@uni-passau.de
More informationOperating Virtual Panels with Hand Gestures in Immersive VR Games
Operating Virtual Panels with Hand Gestures in Immersive VR Games Experiences with the Leap Motion Controller Yin Zhang and Oscar Meruvia-Pastor Department of Computer Science, Memorial University of Newfoundland,
More informationMECHANICAL DESIGN LEARNING ENVIRONMENTS BASED ON VIRTUAL REALITY TECHNOLOGIES
INTERNATIONAL CONFERENCE ON ENGINEERING AND PRODUCT DESIGN EDUCATION 4 & 5 SEPTEMBER 2008, UNIVERSITAT POLITECNICA DE CATALUNYA, BARCELONA, SPAIN MECHANICAL DESIGN LEARNING ENVIRONMENTS BASED ON VIRTUAL
More informationR (2) Controlling System Application with hands by identifying movements through Camera
R (2) N (5) Oral (3) Total (10) Dated Sign Assignment Group: C Problem Definition: Controlling System Application with hands by identifying movements through Camera Prerequisite: 1. Web Cam Connectivity
More informationZeroTouch: A Zero-Thickness Optical Multi-Touch Force Field
ZeroTouch: A Zero-Thickness Optical Multi-Touch Force Field Figure 1 Zero-thickness visual hull sensing with ZeroTouch. Copyright is held by the author/owner(s). CHI 2011, May 7 12, 2011, Vancouver, BC,
More informationPRODUCTS DOSSIER. / DEVELOPMENT KIT - VERSION NOVEMBER Product information PAGE 1
PRODUCTS DOSSIER DEVELOPMENT KIT - VERSION 1.1 - NOVEMBER 2017 www.neurodigital.es / hello@neurodigital.es Product information PAGE 1 Minimum System Specs Operating System Windows 8.1 or newer Processor
More informationCONTROLLING METHODS AND CHALLENGES OF ROBOTIC ARM
CONTROLLING METHODS AND CHALLENGES OF ROBOTIC ARM Aniket D. Kulkarni *1, Dr.Sayyad Ajij D. *2 *1(Student of E&C Department, MIT Aurangabad, India) *2(HOD of E&C department, MIT Aurangabad, India) aniket2212@gmail.com*1,
More informationHead Tracking for Google Cardboard by Simond Lee
Head Tracking for Google Cardboard by Simond Lee (slee74@student.monash.edu) Virtual Reality Through Head-mounted Displays A head-mounted display (HMD) is a device which is worn on the head with screen
More informationECC419 IMAGE PROCESSING
ECC419 IMAGE PROCESSING INTRODUCTION Image Processing Image processing is a subclass of signal processing concerned specifically with pictures. Digital Image Processing, process digital images by means
More informationPortfolio. Swaroop Kumar Pal swarooppal.wordpress.com github.com/swarooppal1088
Portfolio About Me: I am a Computer Science graduate student at The University of Texas at Dallas. I am currently working as Augmented Reality Engineer at Aireal, Dallas and also as a Graduate Researcher
More informationAR 2 kanoid: Augmented Reality ARkanoid
AR 2 kanoid: Augmented Reality ARkanoid B. Smith and R. Gosine C-CORE and Memorial University of Newfoundland Abstract AR 2 kanoid, Augmented Reality ARkanoid, is an augmented reality version of the popular
More informationDesign a Model and Algorithm for multi Way Gesture Recognition using Motion and Image Comparison
e-issn 2455 1392 Volume 2 Issue 10, October 2016 pp. 34 41 Scientific Journal Impact Factor : 3.468 http://www.ijcter.com Design a Model and Algorithm for multi Way Gesture Recognition using Motion and
More informationHumantenna. ubicomp lab. Using the Human Body as an Antenna for Real-Time Whole-Body Interaction
Humantenna Using the Human Body as an Antenna for Real-Time Whole-Body Interaction Gabe Cohn 1,2 Dan Morris 1 Shwetak N. Patel 1,2 Desney S. Tan 1 1 Microsoft Research 2 University of Washington MSR Faculty
More informationA Real Time Static & Dynamic Hand Gesture Recognition System
International Journal of Engineering Inventions e-issn: 2278-7461, p-issn: 2319-6491 Volume 4, Issue 12 [Aug. 2015] PP: 93-98 A Real Time Static & Dynamic Hand Gesture Recognition System N. Subhash Chandra
More informationAbstract. Keywords: Multi Touch, Collaboration, Gestures, Accelerometer, Virtual Prototyping. 1. Introduction
Creating a Collaborative Multi Touch Computer Aided Design Program Cole Anagnost, Thomas Niedzielski, Desirée Velázquez, Prasad Ramanahally, Stephen Gilbert Iowa State University { someguy tomn deveri
More informationCOMPARATIVE PERFORMANCE ANALYSIS OF HAND GESTURE RECOGNITION TECHNIQUES
International Journal of Advanced Research in Engineering and Technology (IJARET) Volume 9, Issue 3, May - June 2018, pp. 177 185, Article ID: IJARET_09_03_023 Available online at http://www.iaeme.com/ijaret/issues.asp?jtype=ijaret&vtype=9&itype=3
More informationControlling Humanoid Robot Using Head Movements
Volume-5, Issue-2, April-2015 International Journal of Engineering and Management Research Page Number: 648-652 Controlling Humanoid Robot Using Head Movements S. Mounica 1, A. Naga bhavani 2, Namani.Niharika
More informationGroup #17 Arian Garcia Javier Morales Tatsiana Smahliuk Christopher Vendette
Group #17 Arian Garcia Javier Morales Tatsiana Smahliuk Christopher Vendette Electrical Engineering Electrical Engineering Electrical Engineering Electrical Engineering Contents 1 2 3 4 5 6 7 8 9 Motivation
More informationLocalized Space Display
Localized Space Display EE 267 Virtual Reality, Stanford University Vincent Chen & Jason Ginsberg {vschen, jasong2}@stanford.edu 1 Abstract Current virtual reality systems require expensive head-mounted
More informationGESTURE RECOGNITION SOLUTION FOR PRESENTATION CONTROL
GESTURE RECOGNITION SOLUTION FOR PRESENTATION CONTROL Darko Martinovikj Nevena Ackovska Faculty of Computer Science and Engineering Skopje, R. Macedonia ABSTRACT Despite the fact that there are different
More informationHMD based VR Service Framework. July Web3D Consortium Kwan-Hee Yoo Chungbuk National University
HMD based VR Service Framework July 31 2017 Web3D Consortium Kwan-Hee Yoo Chungbuk National University khyoo@chungbuk.ac.kr What is Virtual Reality? Making an electronic world seem real and interactive
More informationEvaluation of Visuo-haptic Feedback in a 3D Touch Panel Interface
Evaluation of Visuo-haptic Feedback in a 3D Touch Panel Interface Xu Zhao Saitama University 255 Shimo-Okubo, Sakura-ku, Saitama City, Japan sheldonzhaox@is.ics.saitamau.ac.jp Takehiro Niikura The University
More informationContent Based Image Retrieval Using Color Histogram
Content Based Image Retrieval Using Color Histogram Nitin Jain Assistant Professor, Lokmanya Tilak College of Engineering, Navi Mumbai, India. Dr. S. S. Salankar Professor, G.H. Raisoni College of Engineering,
More information3D Data Navigation via Natural User Interfaces
3D Data Navigation via Natural User Interfaces Francisco R. Ortega PhD Candidate and GAANN Fellow Co-Advisors: Dr. Rishe and Dr. Barreto Committee Members: Dr. Raju, Dr. Clarke and Dr. Zeng GAANN Fellowship
More information