Development of K-Touch TM Haptic API for Various Datasets

Beom-Chan Lee (1), Jong-Phil Kim (2), Jongeun Cha (3), Jeha Ryu (4)

ABSTRACT

This paper presents the development of a new haptic API (Application Programming Interface) called the K-Touch TM haptic API. The API provides users with haptic interaction for various data representations such as 2D, 2.5D (height field), and 3D data, for dynamic as well as static objects. In addition, it supports kinesthetic and tactile interaction simultaneously for more versatile haptic feedback. It is built on a scalable software architecture and offers a convenient software interface for users who are not familiar with haptic technology.

Keywords: haptics, haptic API, kinesthetic, tactile, hybrid environments

1 INTRODUCTION

In a general sense, haptics means touch that provides users with the tactual sensation of virtual, real, or augmented environments. Haptic technology allows tactual exploration and interactive manipulation of virtual objects through haptic interfaces. Haptic interaction can be roughly classified into two modalities: kinesthetic feedback (force, motion) and tactile display (cutaneous touch). Kinesthetic feedback encompasses the perception of large-scale details, such as object shape, and mechanical properties, such as compliance; it is achieved through feedback from the muscular and skeletal systems. Tactile perception, meanwhile, occurs through the mechanoreceptive nerve endings in the skin and is therefore primarily a means of relaying information about small-scale details that cause skin stretch, compression, and vibration. Together, kinesthetic feedback and tactile display give users a deeper recognition of virtual environments and enhance the sense of immersion in them [1]. In various application areas such as medical science, industrial design, education, and entertainment, efficient and effective haptic display systems and rendering algorithms have therefore been proposed to support immersive experiences.
A haptic rendering algorithm is a process comprising collision detection between the haptic probe and a virtual object, and computation of the contact force corresponding to that collision. In general, haptic rendering algorithms depend on the geometry representation of the object being touched. They can be classified into two basic categories: surface haptic rendering algorithms for surface data [2-5] and volumetric haptic rendering algorithms for volumetric data [6-8]. Although various haptic rendering algorithms have been proposed, creating a haptic application remains difficult for users who are less familiar with the technology, because collision detection and force computation are fairly complicated for complex virtual and/or augmented environments. (Author e-mails: (1) bclee@gist.ac.kr, (2) lowtar@gist.ac.kr, (3) gaecha@gist.ac.kr, (4) ryu@gist.ac.kr.) Therefore, several tools have been developed to meet the needs of general and expert users [9-15]. Typically, these haptic tools allow a software engineer to create haptic applications for a certain software package, software framework, hardware platform, computer system, or operating system. The GHOST SDK was developed specifically for PHANTOM haptic devices [9]. Although the GHOST SDK can create haptic applications, it has some limitations when applied to diverse areas. It is difficult for users to access the haptic servo loop because the GHOST architecture is encapsulated. In addition, since the device and architectural levels are interlocked with each other, users cannot access each level to extend the architecture. To overcome these limitations, SensAble released another haptic toolkit, OpenHaptics [9]. It consists of two API levels: HDAPI and HLAPI. HDAPI is a low-level foundational layer for haptics, best suited for developers who are familiar with haptic paradigms and with sending forces directly. HLAPI is built on top of HDAPI and provides higher-level control of haptics.
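The two stages just described, collision detection and contact force computation, can be illustrated with a minimal penalty-based sketch for a point probe touching a sphere (a generic textbook formulation for illustration only, not the algorithm of any particular SDK discussed here):

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;

// Collision detection: penetration depth of a point probe into a
// sphere of radius r centered at c (positive means the probe is inside).
double penetration(const Vec3& p, const Vec3& c, double r) {
    const double dx = p[0] - c[0], dy = p[1] - c[1], dz = p[2] - c[2];
    return r - std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Force computation: penalty force F = k * d along the outward surface
// normal while penetrating, zero otherwise (a probe at the exact center
// is left as a degenerate case for brevity).
Vec3 contactForce(const Vec3& p, const Vec3& c, double r, double k) {
    const double d = penetration(p, c, r);
    if (d <= 0.0) return {0.0, 0.0, 0.0};            // no contact
    const double dx = p[0] - c[0], dy = p[1] - c[1], dz = p[2] - c[2];
    const double len = std::sqrt(dx * dx + dy * dy + dz * dz);
    return {k * d * dx / len, k * d * dy / len, k * d * dz / len};
}
```

For a probe at (0.9, 0, 0) inside a unit sphere at the origin with stiffness k = 500 N/m, the penetration is 0.1 and the force is approximately (50, 0, 0) N, pushing the probe back toward the surface. A rendering loop runs this pair of steps once per haptic update.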
HLAPI itself is developed on top of the well-established OpenGL graphics API. However, it is still difficult to create derived classes from the OpenHaptics architecture. The ReachIn API was commercialized as the first haptic and graphic API independent of a specific haptic device [10]. The ReachIn API is based on the scene-graph description file format VRML (Virtual Reality Modeling Language): virtual environments controlled by the ReachIn API are defined in VRML, while the nodes of the scene graph are built in C++. The ReachIn API can be used in conjunction with PHANTOM [9] and DELTA [11] haptic devices as well as the proprietary ReachIn display, which supports graphic and haptic co-location. However, ReachIn requires high-end components, such as a high-performance dual-CPU workstation, graphics hardware (NVIDIA Quadro FX 1400 or above), and stereo display components. The e-touch API was the first API to provide a set of open-module haptic and graphic libraries [12]. The e-touch API supports both PHANTOM and DELTA devices and allows users to create haptic scenes featuring rigid meshes and simpler implicit-surface haptic objects. Moreover, it allows users to create a haptic-based desktop environment in which menus can be accessed using the haptic device, and it supports the GHOST software toolkit. The e-touch API includes a full set of commands for creating interfaces, tools, and navigation techniques. The H3D TM API is a C++ implementation of an X3D-based, open-source scene-graph API for the development of haptic applications [13]. It closely follows the X3D design,
and extends graphical rendering with haptic rendering. The H3D TM API has an open-module architecture and can therefore be expanded and modified; it also uses the C++ Standard Template Library. Haptic rendering is performed by OpenHaptics. In H3D TM, haptic rendering can easily be disabled, allowing H3D TM to be used as a purely X3D-compliant API. The H3D TM API supports graphic and haptic co-location with three types of workbenches. The CHAI library by Conti et al. [14,15] is a set of graphic and haptic C++ libraries that allow both high-level and low-level programming of haptic applications. CHAI is an open-source, freely available set of C++ libraries for computer haptics. It supports several commercially available three- and six-degree-of-freedom (DOF) haptic devices and makes it simple to support new custom force-feedback devices. It is especially suitable for education and research purposes, offering a light platform on which extensions can be developed. It also supports developer-friendly creation of multimodal virtual worlds by tightly integrating the haptic and visual representations of objects. The I-Touch haptic framework by Pocheville et al. [16] was designed to provide open-architecture tools. Its purpose is academic: it aims at a generic framework that allows haptics researchers to prototype their applications quickly. Since this framework has a modular structure, no haptic scene graph is defined; haptic feedback comes directly from the dynamic simulation engine, which computes the contact force. It thus offers the software flexibility to create applications with haptics in a simple manner. I-Touch is especially application-oriented and targets a priori virtual prototyping with haptic feedback in industry. Haptik by de Pascale et al. [17] is a component-based SDK that guarantees binary compatibility of applications with future versions of devices and plug-ins.
The Haptik library is an open-source library with a component-based architecture that acts as a hardware abstraction layer providing uniform access to haptic devices. It does not contain graphic primitives, physics-related haptic algorithms, or complex class hierarchies; instead it exposes a set of interfaces that hide device differences from applications. Haptik has a very simple and non-invasive API, which makes it easy to adopt even in existing projects.

The haptic SDKs and APIs surveyed above were all developed to ease the creation of haptic applications. Nevertheless, each SDK or API has limitations. For example, the haptic rendering algorithms implemented in current SDKs and APIs support surface-based models only. Thus they cannot directly support other data representations, such as volume and height field (2.5D depth image) datasets, without converting them into polygonal meshes. In addition, the rapid development of computer graphics has made it possible to provide virtual worlds with hybrid environments, which can contain diverse objects with various data representations in the same virtual world. Current haptic SDKs and APIs, however, cannot support haptic interaction with hybrid environments, because a surface-based haptic algorithm cannot cover various datasets without preprocessing. To allow haptic interaction with diverse data representations using existing haptic SDKs and APIs, either the object data need additional processing, such as unification and reconstruction of the hierarchical structure in real time, or different types of haptic rendering algorithm must be operated selectively with respect to the data representation. In the former case, there are real-time issues in converting large datasets and dynamically changing or dynamically loaded objects. In the latter case, the haptic update rate can deteriorate, especially for hybrid environments.
Furthermore, these haptic tools are generally limited to kinesthetic feedback. Even though tactile sensation in virtual environments also plays an important role in recognizing surface properties such as roughness and texture, existing SDKs and APIs provide no functionality or libraries for this more versatile haptic interaction. This paper presents the development of a new haptic rendering API (Application Programming Interface), called the K-Touch TM haptic API, based on the core haptic rendering algorithm previously developed by the authors [18,19]. It is designed to let users interact with virtual objects through both kinesthetic and tactile modalities via haptic interfaces. The K-Touch TM haptic API is a set of haptic and graphic C/C++ classes. Users who are not interested in the implementation details of haptic rendering can easily create visual-haptic scenes using a large set of pre-implemented haptic and graphic algorithms. The architectural design of the API considered the following aspects: 1) support for hybrid environments with various data representations (polygon, height field (2.5D), volume, and image datasets); 2) simultaneous kinesthetic and tactile haptic interaction; 3) efficient haptic and graphic rendering algorithms; 4) convenience of use; and 5) an extensible software architecture. This paper is organized as follows: Section 2 describes the concepts and key features of the proposed haptic API. The detailed hierarchical architecture is described in Section 3. Finally, Section 4 presents preliminary application examples, and Section 5 discusses the proposed haptic API and lists possible future work.

2 CONCEPTS AND KEY FEATURES OF K-TOUCH TM API

The K-Touch TM haptic API is targeted at general users who are less familiar with haptics technology but want to quickly and easily add haptics to their applications. It can also support developers who are familiar with haptic paradigms.
It is also designed to serve various application fields with kinesthetic and tactile modalities together, and to support haptic interaction with various data representations. The following key elements were identified as fundamentally important:

i. Independence from the haptic rendering contents or data, be it a conventional polygon model, a volumetric dataset, or a 2.5D height-field representation.
ii. Simultaneous kinesthetic and tactile interaction.
iii. Ease of use in creating a visual and haptic scene.
iv. Efficient graphic and haptic rendering and software architecture.

To realize the points above, the proposed K-Touch TM haptic API has been designed with the following features.

i) As mentioned in the previous section, all existing haptic rendering SDKs and APIs are based on surface-based haptic rendering. Current haptic tools therefore struggle to support haptic interaction with hybrid environments containing volume, height field (2.5D), and image data. The K-Touch TM haptic API supports haptic interaction with hybrid environments of various data representations through a core haptic rendering algorithm that is independent of the type of data representation in the haptic scene [18,19]. Both collision detection and force computation are based on the graphics hardware. In general, graphics hardware is used to render virtual objects in window coordinates, so all graphical contents or data pass through the graphics pipeline. Using the graphics hardware, the data for haptic interaction are acquired in a uniform representation with the LOMI concept [18]. With these characteristics, the K-Touch TM API is capable of covering various data representations such as conventional 3D
models, height field (2.5D) data, volume data, and captured 2D images.

ii) Most current haptic rendering libraries only allow single-point contact interaction through kinesthetic paradigms. The K-Touch TM haptic API, by contrast, allows users to receive kinesthetic and tactile sensations simultaneously. A PHANTOM, for instance, can interact with virtual objects not only through kinesthetic interaction but also through tactile display, by combining the force-reflecting PHANTOM device with an external tactile system. The key element in accomplishing this is the distinction between the kinesthetic and tactile rendering classes. A kinesthetic rendering class is a contact-force-level entity acting on the muscles. A tactile rendering class, on the other hand, is the entity that creates surface properties applied directly to the skin surface.

iii) The K-Touch TM haptic API is C/C++ based and is developed using Microsoft Visual C++. The K-Touch TM architecture defines basic methods for creating and maintaining a scene-graph structure that represents the logical organization of the graphic and haptic objects in an environment. A user can therefore create a haptic application easily with a few lines of C/C++ code.
A sample code is:

    world = new KTouchWorld();                     // Create and initialize haptic scene
    viewport = new KTouchViewport(world);          // Initialize viewport for graphic rendering
    world->setviewport(viewport);                  // Add to haptic scene graph

    // Create static object node
    staticnode = new KTouchStaticObject(world);
    world->addchild(staticnode);                   // Add to haptic scene graph

    // Create static object
    staticobjpbuff = new KTouchPbuffObj(world);
    // Load a virtual object from .obj file format
    staticobjpbuff->createobject("top.obj", "top.bmp");
    staticnode->addchild(staticobjpbuff);          // Add to haptic scene graph

    // Start current haptic device and rendering algorithm
    Tool->start();
    viewport->RenderObjects();                     // Render virtual objects

iv) The K-Touch TM haptic API can be used as a high-level tool to create a haptic scene easily. A haptic scene is one in which specific objects can be rendered haptically as well as graphically; it is composed of two main parts, graphical contents and haptic interaction. The base root class of the K-Touch TM API can be populated with various virtual objects, a haptic device, properties of the virtual environment, and haptic rendering algorithms. OpenGL-based graphic rendering of the objects and of the haptic interaction point is automatically taken care of by the haptic rendering engine. To improve the graphic rendering update rate, both conventional and state-of-the-art graphic rendering methods are implemented: efficient graphic rendering is realized with indexed geometry, display lists (using system cache memory), and vertex buffer objects (using video memory).

Fig. 1: The K-Touch TM haptic API architecture.

Since the proposed API is constructed with an Object-Oriented Programming (OOP) methodology, users can easily create or reuse derived classes from the hierarchical structure of the K-Touch TM haptic API. The API also supports major commercially available 3-DOF haptic devices such as the PHANTOM.
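The scene-graph structure that the sample builds, a world root whose children are object nodes that in turn own renderable objects, can be illustrated with a generic sketch (the class below is illustrative only, not the actual K-Touch node class):

```cpp
#include <memory>
#include <string>
#include <vector>

// Generic scene-graph node: each node owns its children and is visited
// once per traversal, mirroring the chain of addchild() calls above.
class Node {
public:
    explicit Node(std::string name) : name_(std::move(name)) {}
    virtual ~Node() = default;

    void addChild(std::unique_ptr<Node> child) {
        children_.push_back(std::move(child));
    }

    // Depth-first traversal; a real API would issue graphic and haptic
    // rendering calls here instead of collecting node names.
    void visit(std::vector<std::string>& out) const {
        out.push_back(name_);
        for (const auto& c : children_) c->visit(out);
    }

private:
    std::string name_;
    std::vector<std::unique_ptr<Node>> children_;
};
```

Building world -> static node -> object and traversing from the root visits the nodes in the same parent-before-child order in which a renderer would process them.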
In addition, the K-Touch TM haptic API allows users to easily control and write drivers for other devices.

3 K-TOUCH TM HAPTIC API ARCHITECTURE

The proposed haptic API is created by connecting a chain of class components. The overall architecture of the K-Touch TM haptic API is shown in Fig. 1. The KTouchObject class can be populated with various virtual objects. Virtual objects are classified as static or dynamic according to their dynamic properties. While static objects represent backgrounds or three-dimensional objects without motion properties, dynamic objects represent moving objects as well as functional objects with specific properties, such as buttons or sliders, in a haptic scene. In addition, the haptic scene can include a depth image (2.5D height-field dataset) comprising a depth image together with an RGB image; this depth-image data representation is managed by the KTouchImage class. All objects in a haptic scene can be referred to the OpenGL-based graphic rendering and haptic rendering classes. To render the various objects graphically, the OpenGL-based graphic rendering class provides the user with rendering functions for displaying existing objects and supports virtual-environment properties such as camera and light parameters. For haptic rendering, the KTouchDepth and KTouchLomi classes are responsible for performing collision detection and calculating the forces applied to the user to simulate the desired effect. A virtual object or image is defined as pixels in window coordinates. Each pixel has a depth value in addition to its RGB data, which is usually used to cull hidden surfaces and to create an appearance of three dimensions in a perspective plot. The KTouchDepth class acquires this depth information about the existing objects. To obtain the depth values of three-dimensional objects, six virtual cameras are located around the portion of interest. The acquired depth values are used to create and update the LOMI.
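In the API these depth values come from the graphics hardware's depth buffer; the effect of one such axis-aligned virtual camera can be sketched on the CPU by analytically "rendering" an orthographic depth image of a sphere (a simplified stand-in for the depth-buffer readback, with an assumed [-1, 1] view volume):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Orthographic depth image of a sphere (center at (cx, cy, cz), radius r)
// seen by a virtual camera looking down +z. Pixels that miss the sphere
// keep the far-plane depth 'zfar', exactly as a cleared depth buffer would.
std::vector<double> sphereDepthMap(int w, int h, double cx, double cy,
                                   double cz, double r, double zfar) {
    std::vector<double> depth(static_cast<std::size_t>(w) * h, zfar);
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            // Map the pixel center to world xy in the assumed [-1, 1] volume.
            const double wx = 2.0 * (x + 0.5) / w - 1.0;
            const double wy = 2.0 * (y + 0.5) / h - 1.0;
            const double dx = wx - cx, dy = wy - cy;
            const double rr = r * r - dx * dx - dy * dy;
            if (rr >= 0.0)  // the view ray hits the sphere: store front depth
                depth[static_cast<std::size_t>(y) * w + x] = cz - std::sqrt(rr);
        }
    }
    return depth;
}
```

Each of the six cameras would produce such an image from its own side; the depth values near the probe are then the raw material from which the LOMI is assembled.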
The LOMI is the entity that contains the local geometry information of the touched object around the haptic probe; at the force computation stage, the LOMI is used to calculate the contact response force [18,19]. When depth information is computed by the graphics hardware, any object in the haptic scene can be rendered into the frame buffer or a pixel buffer. The screen is composed of a rectangular array of pixels, each capable of displaying a tiny square of color at that point in the image. After the rasterization stage of the graphics pipeline, the data are not yet pixels but fragments. Each fragment has coordinate data corresponding to a pixel, as well as color and depth values, and undergoes a series of tests and operations such as blending, anti-aliasing, and polygon offset. At the end of the pipeline, each object is finally written into the frame buffer and displayed on the screen. Normally, the OpenGL libraries are used for rendering into a window displayed on the screen. The depth information is acquired from the depth buffer at six sides, which requires additional graphic rendering. For stationary or rigid objects, a pixel buffer (KTouchPbuff) is used to store the six-side depth image of the whole object through off-line graphic rendering. Note that the pixel-buffer rendering is performed only once, before the haptic thread begins. For deformable objects, the frame buffer
(KTouchFbuff) is used to store a six-side local depth image of the object through on-line graphic rendering. Since the frame-buffer rendering is performed at every graphic rendering cycle, a depth image of the whole object is not required. Both the collision detection and the force computation of a point interaction are performed in the KTouch3dForceAlgo class, which refers to the KTouchDepth and KTouchLomi classes. Tactile rendering is accomplished by the KTouchTactileAlgo class. These two classes are derived from the KTouchForceAlgo class. The haptic rendering scheduler takes charge of the kinesthetic and tactile rendering procedures and guarantees that the haptic rendering loop maintains a 1 kHz update/refresh rate. Finally, the calculated force and tactile effects are applied to the user through the kinesthetic and tactile devices.

4 APPLICATION EXAMPLES

Fig. 2 shows preliminary application examples based on the K-Touch TM haptic API. Figs. 2(a) and 2(b) show users touching real letters captured by a web camera and a 2.5D depth image captured by the Z-Cam, respectively. Fig. 2(c) shows haptic interaction with a 3D photorealistic dataset of more than two million polygons. To support simultaneous kinesthetic and tactile sensations, a vibrotactile device equipped with a simple 2-by-2 pin array is attached to a kinesthetic device (see Fig. 2(d)). Fig. 2(e) shows a haptic game application.

Fig. 2: Application examples using the K-Touch TM.

5 DISCUSSION AND FUTURE WORK

In this paper, the K-Touch TM API has been proposed for creating haptic interaction applications. Users who are not interested in the implementation details of haptic rendering can easily create visual-haptic scenes using a large set of pre-implemented haptic and graphic classes.
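The pre-implemented class relationship named in Section 3 — KTouchForceAlgo as a common base, with KTouch3dForceAlgo computing kinesthetic contact force and KTouchTactileAlgo generating tactile effects — can be sketched as follows; the class names come from the paper, but the member signatures and the simple penalty/vibration models are illustrative assumptions:

```cpp
#include <algorithm>

// Base rendering-algorithm interface shared by the kinesthetic and
// tactile renderers; the haptic scheduler would call render() each tick.
class KTouchForceAlgo {
public:
    virtual ~KTouchForceAlgo() = default;
    // Input: penetration depth of the probe; output: actuator command.
    virtual double render(double penetration) const = 0;
};

// Kinesthetic rendering: a contact force magnitude (penalty model,
// F = k * d) sent to a force-feedback device such as a PHANTOM.
class KTouch3dForceAlgo : public KTouchForceAlgo {
public:
    explicit KTouch3dForceAlgo(double stiffness) : k_(stiffness) {}
    double render(double d) const override {
        return k_ * std::max(d, 0.0);
    }
private:
    double k_;
};

// Tactile rendering: not a net force but a vibration amplitude for the
// pin-array actuators, saturated at the actuator's maximum amplitude.
class KTouchTactileAlgo : public KTouchForceAlgo {
public:
    KTouchTactileAlgo(double gain, double maxAmp)
        : gain_(gain), max_(maxAmp) {}
    double render(double d) const override {
        return std::min(gain_ * std::max(d, 0.0), max_);
    }
private:
    double gain_, max_;
};
```

A scheduler can then hold both renderers through the base interface and tick each one inside the same 1 kHz loop.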
The proposed K-Touch TM API has the following advantages. First, the K-Touch TM API is capable of covering various data representations such as conventional 3D models, height field (2.5D) data, volume data, and captured 2D images. That is, the API is independent of the data representation, which allows users to interact directly with hybrid environments composed of heterogeneous objects. Second, the architecture of the API is designed to support tactile display and kinesthetic interaction simultaneously. To display haptic texture sensations, a tactile display must render a high bandwidth of spatial frequency, unlike the low bandwidth (typically below 30 Hz) of a force-feedback device; we therefore combined tactile display with kinesthetic interaction to support more versatile haptic interaction. In the K-Touch TM haptic API, a high-level software architecture dealing with tactile information generates the tactile rendering signals, while a low-level tactile control algorithm controls the tactile actuators. Third, the K-Touch TM haptic API can support large virtual environments by using the LOMI concept [18] together with state-of-the-art graphic rendering methods. The LOMI is a spatiotemporal occupancy map that represents the local geometry of the graphically rendered object; that is, the LOMI captures only the small piece of geometry corresponding to the portion of interest, so the whole dataset of the virtual environment need not be held. Creation and update of the LOMI structure is independent of the data complexity of the virtual environment. With efficient graphic rendering and the LOMI concept, the K-Touch TM haptic API can support large virtual environments of one million polygons at a 1 kHz haptic rendering rate and a 25-30 Hz graphic rendering rate.
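The two update rates quoted above, roughly 1 kHz for haptics and 25-30 Hz for graphics, are typically maintained with fixed-rate loops; a generic sketch of such a loop (not the API's actual scheduler) is:

```cpp
#include <chrono>
#include <thread>

// Invoke 'tick' at a fixed period for a given number of iterations.
// Advancing an absolute deadline and using sleep_until keeps timing
// error from accumulating, unlike sleeping for a relative duration.
template <typename Fn>
void fixedRateLoop(std::chrono::microseconds period, int iterations, Fn tick) {
    auto deadline = std::chrono::steady_clock::now();
    for (int i = 0; i < iterations; ++i) {
        tick(i);                                  // e.g. collision + force at 1 kHz
        deadline += period;                       // next absolute deadline
        std::this_thread::sleep_until(deadline);
    }
}
```

A haptic thread would run this with a 1000 microsecond period while a graphic thread uses roughly 33000 microseconds; the haptic callback must, of course, finish well within its 1 ms budget for the rate to hold.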
Finally, with respect to ease of use, users or researchers who are familiar with C/C++ programming can easily build a haptic scene with a few lines of code. The API can also help general users who are not familiar with the details of haptics technology to create haptic applications easily: they can build applications from the pre-implemented graphic and haptic functionality, much as they would write an application program in the widely used C language. With these advantages of the K-Touch TM haptic API, various applications providing kinesthetic and tactile interaction can be created conveniently.

The next phase of this work is to implement more advanced functionality. Important considerations in designing a haptic API are easy content generation and distribution, which require a scene-description framework for touchable contents, usually virtual environments. ReachIn [10], GHOST [9], and OpenHaptics [9] adopt VRML, and H3D [13] is based on X3D. However, VRML and X3D do not consider streaming their data; they simply adopt a download-and-play concept, so the user must wait for the contents to download before enjoying them. Since the multimedia available on the Internet is becoming increasingly diverse and high quality, the scene-description framework should be able to extend its data types to handle and stream high-quality, large-volume data so that users need not wait for downloads. We are considering BIFS (BInary Format for Scenes) in the MPEG-4 standards, which deals with various media objects and supports streaming data for each media object. By adopting the MPEG-4 framework in our API, we plan to apply the haptic concept to broadcasting. Furthermore, to provide realistic haptic interaction, an elaborate graphic and haptic co-location system will be built in the near future. To support different types of commercialized haptic devices, additional device classes will be implemented in the architecture of this API.
Another important consideration in the proposed haptic API is the inclusion of material properties such as friction, by separately assigning different friction coefficients to different objects. For this purpose, Haptic User Interfaces (HUI) will be implemented using the proposed haptic API. The HUI will support more convenient haptic modeling operations, such as editing, modification, and creation of the haptic scene, in order to support more intuitive haptic interaction.

ACKNOWLEDGMENT

This work was supported in part by the Ministry of Information and Communication through the Next Generation PC project and the Realistic Broadcasting IT Research Center (RBRC) at GIST, and by the Ministry of Science and Technology (MOST) through the Immersive Contents Research Center (ICRC) at GIST.

REFERENCES

[1] K. Salisbury, F. Barbagli, and F. Conti, Haptic Rendering: Introductory Concepts, IEEE Computer Graphics and Applications, vol. 24, no. 2.
[2] C. Zilles and K. Salisbury, A Constraint-Based God-Object Method for Haptic Display, IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, vol. 3.
[3] D.C. Ruspini, K. Kolarov, and O. Khatib, The Haptic Display of Complex Graphical Environments, Proc. ACM SIGGRAPH, vol. 1.
[4] K. Salisbury and C. Tarr, Haptic Rendering of Surfaces Defined by Implicit Functions, Proc. ASME Dynamic Systems and Control Division, vol. 61.
[5] S. Walker and K. Salisbury, Large Haptic Topographic Maps: MarsView and the Proxy Graph Algorithm, Proc. ACM Symp. on Interactive 3D Graphics.
[6] W. McNeely, K. Puterbaugh, and J. Troy, Six Degree-of-Freedom Haptic Rendering Using Voxel Sampling, Proc. ACM SIGGRAPH.
[7] D.A. Lawrence, C.D. Lee, and L.Y. Pao, Shock and Vortex Visualization Using a Combined Visual/Haptic Interface, Proc. IEEE Visualization.
[8] A. Prior and K. Haines, The Use of a Proximity Agent in a Collaborative Virtual Environment with 6 Degrees-of-Freedom Voxel-Based Haptic Rendering, WHC 2005.
[9] SensAble Technologies Inc.
[10] ReachIn.
[11] Force Dimension Inc.
[12] Novint.
[13] SenseGraphics Inc.
[14] CHAI 3D.
[15] F. Conti, F. Barbagli, D. Morris, and C. Sewell, CHAI: An Open-Source Library for the Rapid Development of Haptic Scenes, IEEE World Haptics, demo presentation.
[16] A. Pocheville and A. Kheddar, I-TOUCH: A Framework for Computer Haptics, IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS).
[17] M. de Pascale, G. de Pascale, D. Prattichizzo, and F. Barbagli, The Haptik Library - a Component Based Architecture for Haptic Devices Access, EuroHaptics 2004, poster.
[18] J.P. Kim and J. Ryu, Hardware-Based 2.5D Haptic Rendering Algorithm Using Localized Occupancy Map Instance, Int. Conf. on Artificial Reality and Telexistence (ICAT).
[19] J.P. Kim, B.C. Lee, and J. Ryu, Haptic Rendering with Six Virtual Cameras, HCI International 2005, 2005.


More information

MECHANICAL DESIGN LEARNING ENVIRONMENTS BASED ON VIRTUAL REALITY TECHNOLOGIES

MECHANICAL DESIGN LEARNING ENVIRONMENTS BASED ON VIRTUAL REALITY TECHNOLOGIES INTERNATIONAL CONFERENCE ON ENGINEERING AND PRODUCT DESIGN EDUCATION 4 & 5 SEPTEMBER 2008, UNIVERSITAT POLITECNICA DE CATALUNYA, BARCELONA, SPAIN MECHANICAL DESIGN LEARNING ENVIRONMENTS BASED ON VIRTUAL

More information

Haptic Rendering and Volumetric Visualization with SenSitus

Haptic Rendering and Volumetric Visualization with SenSitus Haptic Rendering and Volumetric Visualization with SenSitus Stefan Birmanns, Ph.D. Department of Molecular Biology The Scripps Research Institute 10550 N. Torrey Pines Road, Mail TPC6 La Jolla, California,

More information

MPEG-V Based Web Haptic Authoring Tool

MPEG-V Based Web Haptic Authoring Tool MPEG-V Based Web Haptic Authoring Tool by Yu Gao Thesis submitted to the Faculty of Graduate and Postdoctoral Studies In partial fulfillment of the requirements For the M.A.Sc degree in Electrical and

More information

Abstract. 1. Introduction

Abstract. 1. Introduction GRAPHICAL AND HAPTIC INTERACTION WITH LARGE 3D COMPRESSED OBJECTS Krasimir Kolarov Interval Research Corp., 1801-C Page Mill Road, Palo Alto, CA 94304 Kolarov@interval.com Abstract The use of force feedback

More information

2. Introduction to Computer Haptics

2. Introduction to Computer Haptics 2. Introduction to Computer Haptics Seungmoon Choi, Ph.D. Assistant Professor Dept. of Computer Science and Engineering POSTECH Outline Basics of Force-Feedback Haptic Interfaces Introduction to Computer

More information

Computer Haptics and Applications

Computer Haptics and Applications Computer Haptics and Applications EURON Summer School 2003 Cagatay Basdogan, Ph.D. College of Engineering Koc University, Istanbul, 80910 (http://network.ku.edu.tr/~cbasdogan) Resources: EURON Summer School

More information

Practical Data Visualization and Virtual Reality. Virtual Reality VR Display Systems. Karljohan Lundin Palmerius

Practical Data Visualization and Virtual Reality. Virtual Reality VR Display Systems. Karljohan Lundin Palmerius Practical Data Visualization and Virtual Reality Virtual Reality VR Display Systems Karljohan Lundin Palmerius Synopsis Virtual Reality basics Common display systems Visual modality Sound modality Interaction

More information

LOOKING AHEAD: UE4 VR Roadmap. Nick Whiting Technical Director VR / AR

LOOKING AHEAD: UE4 VR Roadmap. Nick Whiting Technical Director VR / AR LOOKING AHEAD: UE4 VR Roadmap Nick Whiting Technical Director VR / AR HEADLINE AND IMAGE LAYOUT RECENT DEVELOPMENTS RECENT DEVELOPMENTS At Epic, we drive our engine development by creating content. We

More information

CS277 - Experimental Haptics Lecture 2. Haptic Rendering

CS277 - Experimental Haptics Lecture 2. Haptic Rendering CS277 - Experimental Haptics Lecture 2 Haptic Rendering Outline Announcements Human haptic perception Anatomy of a visual-haptic simulation Virtual wall and potential field rendering A note on timing...

More information

MHaptic : a Haptic Manipulation Library for Generic Virtual Environments

MHaptic : a Haptic Manipulation Library for Generic Virtual Environments MHaptic : a Haptic Manipulation Library for Generic Virtual Environments Renaud Ott, Vincent De Perrot, Daniel Thalmann and Frédéric Vexo Virtual Reality Laboratory (VRLab) École Polytechnique Fédérale

More information

Multi-Rate Multi-Range Dynamic Simulation for Haptic Interaction

Multi-Rate Multi-Range Dynamic Simulation for Haptic Interaction Multi-Rate Multi-Range Dynamic Simulation for Haptic Interaction Ikumi Susa Makoto Sato Shoichi Hasegawa Tokyo Institute of Technology ABSTRACT In this paper, we propose a technique for a high quality

More information

Haptic Data Transmission based on the Prediction and Compression

Haptic Data Transmission based on the Prediction and Compression Haptic Data Transmission based on the Prediction and Compression 375 19 X Haptic Data Transmission based on the Prediction and Compression Yonghee You and Mee Young Sung Department of Computer Science

More information

Touching and Walking: Issues in Haptic Interface

Touching and Walking: Issues in Haptic Interface Touching and Walking: Issues in Haptic Interface Hiroo Iwata 1 1 Institute of Engineering Mechanics and Systems, University of Tsukuba, 80, Tsukuba, 305-8573 Japan iwata@kz.tsukuba.ac.jp Abstract. This

More information

VEWL: A Framework for Building a Windowing Interface in a Virtual Environment Daniel Larimer and Doug A. Bowman Dept. of Computer Science, Virginia Tech, 660 McBryde, Blacksburg, VA dlarimer@vt.edu, bowman@vt.edu

More information

Haptics CS327A

Haptics CS327A Haptics CS327A - 217 hap tic adjective relating to the sense of touch or to the perception and manipulation of objects using the senses of touch and proprioception 1 2 Slave Master 3 Courtesy of Walischmiller

More information

Haptic Rendering CPSC / Sonny Chan University of Calgary

Haptic Rendering CPSC / Sonny Chan University of Calgary Haptic Rendering CPSC 599.86 / 601.86 Sonny Chan University of Calgary Today s Outline Announcements Human haptic perception Anatomy of a visual-haptic simulation Virtual wall and potential field rendering

More information

E90 Project Proposal. 6 December 2006 Paul Azunre Thomas Murray David Wright

E90 Project Proposal. 6 December 2006 Paul Azunre Thomas Murray David Wright E90 Project Proposal 6 December 2006 Paul Azunre Thomas Murray David Wright Table of Contents Abstract 3 Introduction..4 Technical Discussion...4 Tracking Input..4 Haptic Feedack.6 Project Implementation....7

More information

VIRTUAL REALITY FOR NONDESTRUCTIVE EVALUATION APPLICATIONS

VIRTUAL REALITY FOR NONDESTRUCTIVE EVALUATION APPLICATIONS VIRTUAL REALITY FOR NONDESTRUCTIVE EVALUATION APPLICATIONS Jaejoon Kim, S. Mandayam, S. Udpa, W. Lord, and L. Udpa Department of Electrical and Computer Engineering Iowa State University Ames, Iowa 500

More information

Benefits of using haptic devices in textile architecture

Benefits of using haptic devices in textile architecture 28 September 2 October 2009, Universidad Politecnica de Valencia, Spain Alberto DOMINGO and Carlos LAZARO (eds.) Benefits of using haptic devices in textile architecture Javier SANCHEZ *, Joan SAVALL a

More information

Modeling and Experimental Studies of a Novel 6DOF Haptic Device

Modeling and Experimental Studies of a Novel 6DOF Haptic Device Proceedings of The Canadian Society for Mechanical Engineering Forum 2010 CSME FORUM 2010 June 7-9, 2010, Victoria, British Columbia, Canada Modeling and Experimental Studies of a Novel DOF Haptic Device

More information

A Novel Test-Bed for Immersive and Interactive Broadcasting Production Using Augmented Reality and Haptics

A Novel Test-Bed for Immersive and Interactive Broadcasting Production Using Augmented Reality and Haptics 106 LETTER Special Section on Artificial Reality and Telexistence A Novel Test-Bed for Immersive and Interactive Broadcasting Production Using Augmented Reality and Haptics Seungjun KIM a), Student Member,

More information

Cutaneous Feedback of Fingertip Deformation and Vibration for Palpation in Robotic Surgery

Cutaneous Feedback of Fingertip Deformation and Vibration for Palpation in Robotic Surgery Cutaneous Feedback of Fingertip Deformation and Vibration for Palpation in Robotic Surgery Claudio Pacchierotti Domenico Prattichizzo Katherine J. Kuchenbecker Motivation Despite its expected clinical

More information

Haptic Rendering of Large-Scale VEs

Haptic Rendering of Large-Scale VEs Haptic Rendering of Large-Scale VEs Dr. Mashhuda Glencross and Prof. Roger Hubbold Manchester University (UK) EPSRC Grant: GR/S23087/0 Perceiving the Sense of Touch Important considerations: Burdea: Haptic

More information

INTELLIGENT GUIDANCE IN A VIRTUAL UNIVERSITY

INTELLIGENT GUIDANCE IN A VIRTUAL UNIVERSITY INTELLIGENT GUIDANCE IN A VIRTUAL UNIVERSITY T. Panayiotopoulos,, N. Zacharis, S. Vosinakis Department of Computer Science, University of Piraeus, 80 Karaoli & Dimitriou str. 18534 Piraeus, Greece themisp@unipi.gr,

More information

Intelligent Modelling of Virtual Worlds Using Domain Ontologies

Intelligent Modelling of Virtual Worlds Using Domain Ontologies Intelligent Modelling of Virtual Worlds Using Domain Ontologies Wesley Bille, Bram Pellens, Frederic Kleinermann, and Olga De Troyer Research Group WISE, Department of Computer Science, Vrije Universiteit

More information

Building a bimanual gesture based 3D user interface for Blender

Building a bimanual gesture based 3D user interface for Blender Modeling by Hand Building a bimanual gesture based 3D user interface for Blender Tatu Harviainen Helsinki University of Technology Telecommunications Software and Multimedia Laboratory Content 1. Background

More information

Virtual Environments. Ruth Aylett

Virtual Environments. Ruth Aylett Virtual Environments Ruth Aylett Aims of the course 1. To demonstrate a critical understanding of modern VE systems, evaluating the strengths and weaknesses of the current VR technologies 2. To be able

More information

The 5th International Conference on the Advanced Mechatronics(ICAM2010) Research Issues on Mobile Haptic Interface for Large Virtual Environments Seun

The 5th International Conference on the Advanced Mechatronics(ICAM2010) Research Issues on Mobile Haptic Interface for Large Virtual Environments Seun The 5th International Conference on the Advanced Mechatronics(ICAM2010) Research Issues on Mobile Haptic Interface for Large Virtual Environments Seungmoon Choi and In Lee Haptics and Virtual Reality Laboratory

More information

Cody Narber, M.S. Department of Computer Science, George Mason University

Cody Narber, M.S. Department of Computer Science, George Mason University Cody Narber, M.S. cnarber@gmu.edu Department of Computer Science, George Mason University Lynn Gerber, MD Professor, College of Health and Human Services Director, Center for the Study of Chronic Illness

More information

Beyond: collapsible tools and gestures for computational design

Beyond: collapsible tools and gestures for computational design Beyond: collapsible tools and gestures for computational design The MIT Faculty has made this article openly available. Please share how this access benefits you. Your story matters. Citation As Published

More information

Exploring Surround Haptics Displays

Exploring Surround Haptics Displays Exploring Surround Haptics Displays Ali Israr Disney Research 4615 Forbes Ave. Suite 420, Pittsburgh, PA 15213 USA israr@disneyresearch.com Ivan Poupyrev Disney Research 4615 Forbes Ave. Suite 420, Pittsburgh,

More information

Haplug: A Haptic Plug for Dynamic VR Interactions

Haplug: A Haptic Plug for Dynamic VR Interactions Haplug: A Haptic Plug for Dynamic VR Interactions Nobuhisa Hanamitsu *, Ali Israr Disney Research, USA nobuhisa.hanamitsu@disneyresearch.com Abstract. We demonstrate applications of a new actuator, the

More information

Using low cost devices to support non-visual interaction with diagrams & cross-modal collaboration

Using low cost devices to support non-visual interaction with diagrams & cross-modal collaboration 22 ISSN 2043-0167 Using low cost devices to support non-visual interaction with diagrams & cross-modal collaboration Oussama Metatla, Fiore Martin, Nick Bryan-Kinns and Tony Stockman EECSRR-12-03 June

More information

Creating a Multimodal 3D Virtual Environment. Johannes Pystynen

Creating a Multimodal 3D Virtual Environment. Johannes Pystynen Creating a Multimodal 3D Virtual Environment Johannes Pystynen University of Tampere School of Information Sciences Interactive Technology M.Sc. Thesis Supervisor: Roope Raisamo 30.12.2011 University of

More information

Beyond Visual: Shape, Haptics and Actuation in 3D UI

Beyond Visual: Shape, Haptics and Actuation in 3D UI Beyond Visual: Shape, Haptics and Actuation in 3D UI Ivan Poupyrev Welcome, Introduction, & Roadmap 3D UIs 101 3D UIs 201 User Studies and 3D UIs Guidelines for Developing 3D UIs Video Games: 3D UIs for

More information

Haptic Sensing and Perception for Telerobotic Manipulation

Haptic Sensing and Perception for Telerobotic Manipulation Haptic Sensing and Perception for Telerobotic Manipulation Emil M. Petriu, Dr. Eng., P.Eng., FIEEE Professor School of Information Technology and Engineering University of Ottawa Ottawa, ON., K1N 6N5 Canada

More information

VR-OOS System Architecture Workshop zu interaktiven VR-Technologien für On-Orbit Servicing

VR-OOS System Architecture Workshop zu interaktiven VR-Technologien für On-Orbit Servicing www.dlr.de Chart 1 > VR-OOS System Architecture > Robin Wolff VR-OOS Workshop 09/10.10.2012 VR-OOS System Architecture Workshop zu interaktiven VR-Technologien für On-Orbit Servicing Robin Wolff DLR, and

More information

College Park, MD 20742, USA virtual environments. To enable haptic rendering of large datasets we

College Park, MD 20742, USA virtual environments. To enable haptic rendering of large datasets we Continuously-Adaptive Haptic Rendering Jihad El-Sana 1 and Amitabh Varshney 2 1 Department of Computer Science, Ben-Gurion University, Beer-Sheva, 84105, Israel jihad@cs.bgu.ac.il 2 Department of Computer

More information

Using Real Objects for Interaction Tasks in Immersive Virtual Environments

Using Real Objects for Interaction Tasks in Immersive Virtual Environments Using Objects for Interaction Tasks in Immersive Virtual Environments Andy Boud, Dr. VR Solutions Pty. Ltd. andyb@vrsolutions.com.au Abstract. The use of immersive virtual environments for industrial applications

More information

An Introduction into Virtual Reality Environments. Stefan Seipel

An Introduction into Virtual Reality Environments. Stefan Seipel An Introduction into Virtual Reality Environments Stefan Seipel stefan.seipel@hig.se What is Virtual Reality? Technically defined: VR is a medium in terms of a collection of technical hardware (similar

More information

Chapter 2 Introduction to Haptics 2.1 Definition of Haptics

Chapter 2 Introduction to Haptics 2.1 Definition of Haptics Chapter 2 Introduction to Haptics 2.1 Definition of Haptics The word haptic originates from the Greek verb hapto to touch and therefore refers to the ability to touch and manipulate objects. The haptic

More information

What is Virtual Reality? What is Virtual Reality? An Introduction into Virtual Reality Environments. Stefan Seipel

What is Virtual Reality? What is Virtual Reality? An Introduction into Virtual Reality Environments. Stefan Seipel An Introduction into Virtual Reality Environments What is Virtual Reality? Technically defined: Stefan Seipel stefan.seipel@hig.se VR is a medium in terms of a collection of technical hardware (similar

More information

Comparative Study of APIs and Frameworks for Haptic Application Development

Comparative Study of APIs and Frameworks for Haptic Application Development Comparative Study of APIs and Frameworks for Haptic Application Development Dorin M. Popovici, Felix G. Hamza-Lup, Adrian Seitan, Crenguta M. Bogdan Mathematics and Computer Science Department Ovidius

More information

Exhibition Strategy of Digital 3D Data of Object in Archives using Digitally Mediated Technologies for High User Experience

Exhibition Strategy of Digital 3D Data of Object in Archives using Digitally Mediated Technologies for High User Experience , pp.150-156 http://dx.doi.org/10.14257/astl.2016.140.29 Exhibition Strategy of Digital 3D Data of Object in Archives using Digitally Mediated Technologies for High User Experience Jaeho Ryu 1, Minsuk

More information

A Kickball Game for Ankle Rehabilitation by JAVA, JNI and VRML

A Kickball Game for Ankle Rehabilitation by JAVA, JNI and VRML A Kickball Game for Ankle Rehabilitation by JAVA, JNI and VRML a a b Hyungjeen Choi, Jeha Ryu, and Chansu Lee a Human Machine Computer Interface Lab, Kwangju Institute of Science and Technology, Kwangju,

More information

Haptics-Augmented Physics Simulation: Coriolis Effect

Haptics-Augmented Physics Simulation: Coriolis Effect Haptics-Augmented Physics Simulation: Coriolis Effect Felix G. Hamza-Lup, Benjamin Page Computer Science and Information Technology Armstrong Atlantic State University Savannah, GA 31419, USA E-mail: felix.hamza-lup@armstrong.edu

More information

Friction & Workspaces

Friction & Workspaces Friction & Workspaces CPSC 599.86 / 601.86 Sonny Chan University of Calgary Today s Agenda Rendering surfaces with friction Exploring large virtual environments using devices with limited workspace [From

More information

Haptic Rendering: Introductory Concepts

Haptic Rendering: Introductory Concepts Rendering: Introductory Concepts Human operator Video and audio device Audio-visual rendering rendering Kenneth Salisbury and Francois Conti Stanford University Federico Barbagli Stanford University and

More information

Reproduction of Human Manipulation Skills in a Robot

Reproduction of Human Manipulation Skills in a Robot University of Wollongong Research Online Faculty of Engineering - Papers (Archive) Faculty of Engineering and Information Sciences 2005 Reproduction of Human Manipulation Skills in a Robot Shen Dong University

More information

INTRODUCTION TO GAME AI

INTRODUCTION TO GAME AI CS 387: GAME AI INTRODUCTION TO GAME AI 3/31/2016 Instructor: Santiago Ontañón santi@cs.drexel.edu Class website: https://www.cs.drexel.edu/~santi/teaching/2016/cs387/intro.html Outline Game Engines Perception

More information

Introduction to Game Design. Truong Tuan Anh CSE-HCMUT

Introduction to Game Design. Truong Tuan Anh CSE-HCMUT Introduction to Game Design Truong Tuan Anh CSE-HCMUT Games Games are actually complex applications: interactive real-time simulations of complicated worlds multiple agents and interactions game entities

More information

Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation

Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation Hiroshi Ishiguro Department of Information Science, Kyoto University Sakyo-ku, Kyoto 606-01, Japan E-mail: ishiguro@kuis.kyoto-u.ac.jp

More information

ISO JTC 1 SC 24 WG9 G E R A R D J. K I M K O R E A U N I V E R S I T Y

ISO JTC 1 SC 24 WG9 G E R A R D J. K I M K O R E A U N I V E R S I T Y New Work Item Proposal: A Standard Reference Model for Generic MAR Systems ISO JTC 1 SC 24 WG9 G E R A R D J. K I M K O R E A U N I V E R S I T Y What is a Reference Model? A reference model (for a given

More information

23270: AUGMENTED REALITY FOR NAVIGATION AND INFORMATIONAL ADAS. Sergii Bykov Technical Lead Machine Learning 12 Oct 2017

23270: AUGMENTED REALITY FOR NAVIGATION AND INFORMATIONAL ADAS. Sergii Bykov Technical Lead Machine Learning 12 Oct 2017 23270: AUGMENTED REALITY FOR NAVIGATION AND INFORMATIONAL ADAS Sergii Bykov Technical Lead Machine Learning 12 Oct 2017 Product Vision Company Introduction Apostera GmbH with headquarter in Munich, was

More information

Networked Virtual Environments

Networked Virtual Environments etworked Virtual Environments Christos Bouras Eri Giannaka Thrasyvoulos Tsiatsos Introduction The inherent need of humans to communicate acted as the moving force for the formation, expansion and wide

More information

3D interaction techniques in Virtual Reality Applications for Engineering Education

3D interaction techniques in Virtual Reality Applications for Engineering Education 3D interaction techniques in Virtual Reality Applications for Engineering Education Cristian Dudulean 1, Ionel Stareţu 2 (1) Industrial Highschool Rosenau, Romania E-mail: duduleanc@yahoo.com (2) Transylvania

More information

HAPTIC GUIDANCE BASED ON HARMONIC FUNCTIONS FOR THE EXECUTION OF TELEOPERATED ASSEMBLY TASKS. Carlos Vázquez Jan Rosell,1

HAPTIC GUIDANCE BASED ON HARMONIC FUNCTIONS FOR THE EXECUTION OF TELEOPERATED ASSEMBLY TASKS. Carlos Vázquez Jan Rosell,1 Preprints of IAD' 2007: IFAC WORKSHOP ON INTELLIGENT ASSEMBLY AND DISASSEMBLY May 23-25 2007, Alicante, Spain HAPTIC GUIDANCE BASED ON HARMONIC FUNCTIONS FOR THE EXECUTION OF TELEOPERATED ASSEMBLY TASKS

More information

Graphical User Interfaces for Blind Users: An Overview of Haptic Devices

Graphical User Interfaces for Blind Users: An Overview of Haptic Devices Graphical User Interfaces for Blind Users: An Overview of Haptic Devices Hasti Seifi, CPSC554m: Assignment 1 Abstract Graphical user interfaces greatly enhanced usability of computer systems over older

More information

Designing Semantic Virtual Reality Applications

Designing Semantic Virtual Reality Applications Designing Semantic Virtual Reality Applications F. Kleinermann, O. De Troyer, H. Mansouri, R. Romero, B. Pellens, W. Bille WISE Research group, Vrije Universiteit Brussel, Pleinlaan 2, 1050 Brussels, Belgium

More information

Design of the ImmersiveTouch : a High-Performance Haptic Augmented Virtual Reality System

Design of the ImmersiveTouch : a High-Performance Haptic Augmented Virtual Reality System Design of the ImmersiveTouch : a High-Performance Haptic Augmented Virtual Reality System Cristian Luciano, Pat Banerjee, Lucian Florea, Greg Dawe Electronic Visualization Laboratory Industrial Virtual

More information

AGENT PLATFORM FOR ROBOT CONTROL IN REAL-TIME DYNAMIC ENVIRONMENTS. Nuno Sousa Eugénio Oliveira

AGENT PLATFORM FOR ROBOT CONTROL IN REAL-TIME DYNAMIC ENVIRONMENTS. Nuno Sousa Eugénio Oliveira AGENT PLATFORM FOR ROBOT CONTROL IN REAL-TIME DYNAMIC ENVIRONMENTS Nuno Sousa Eugénio Oliveira Faculdade de Egenharia da Universidade do Porto, Portugal Abstract: This paper describes a platform that enables

More information

ReVRSR: Remote Virtual Reality for Service Robots

ReVRSR: Remote Virtual Reality for Service Robots ReVRSR: Remote Virtual Reality for Service Robots Amel Hassan, Ahmed Ehab Gado, Faizan Muhammad March 17, 2018 Abstract This project aims to bring a service robot s perspective to a human user. We believe

More information

What is Virtual Reality? What is Virtual Reality? An Introduction into Virtual Reality Environments

What is Virtual Reality? What is Virtual Reality? An Introduction into Virtual Reality Environments An Introduction into Virtual Reality Environments What is Virtual Reality? Technically defined: Stefan Seipel, MDI Inst. f. Informationsteknologi stefan.seipel@hci.uu.se VR is a medium in terms of a collection

More information

Indiana K-12 Computer Science Standards

Indiana K-12 Computer Science Standards Indiana K-12 Computer Science Standards What is Computer Science? Computer science is the study of computers and algorithmic processes, including their principles, their hardware and software designs,

More information

We are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists. International authors and editors

We are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists. International authors and editors We are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists 3,800 116,000 120M Open access books available International authors and editors Downloads Our

More information

What is Virtual Reality? Burdea,1993. Virtual Reality Triangle Triangle I 3 I 3. Virtual Reality in Product Development. Virtual Reality Technology

What is Virtual Reality? Burdea,1993. Virtual Reality Triangle Triangle I 3 I 3. Virtual Reality in Product Development. Virtual Reality Technology Virtual Reality man made reality sense world What is Virtual Reality? Dipl-Ing Indra Kusumah Digital Product Design Fraunhofer IPT Steinbachstrasse 17 D-52074 Aachen Indrakusumah@iptfraunhoferde wwwiptfraunhoferde

More information

The presentation based on AR technologies

The presentation based on AR technologies Building Virtual and Augmented Reality Museum Exhibitions Web3D '04 M09051 선정욱 2009. 05. 13 Abstract Museums to build and manage Virtual and Augmented Reality exhibitions 3D models of artifacts is presented

More information

Diploma Thesis. Adding Haptic Feedback to Geodesy Analysis Tools used in Planetary Surface Exploration. April 22, 2014

Diploma Thesis. Adding Haptic Feedback to Geodesy Analysis Tools used in Planetary Surface Exploration. April 22, 2014 Otto-von-Guericke-University Magdeburg Faculty of Computer Science Dep. of Simulation and Graphics German Aerospace Center Braunschweig Institute of Simulation and Software Technology Dep. of Software

More information

CIS Honours Minor Thesis. Research Proposal Hybrid User Interfaces in Visuo-Haptic Augmented Reality

CIS Honours Minor Thesis. Research Proposal Hybrid User Interfaces in Visuo-Haptic Augmented Reality CIS Honours Minor Thesis Research Proposal Hybrid User Interfaces in Visuo-Haptic Augmented Reality Student: Degree: Supervisor: Ulrich Eck LHIS Dr. Christian Sandor Abstract In 1965, Ivan Sutherland envisioned

More information

IN virtual reality (VR) technology, haptic interface

IN virtual reality (VR) technology, haptic interface 1 Real-time Adaptive Prediction Method for Smooth Haptic Rendering Xiyuan Hou, Olga Sourina, arxiv:1603.06674v1 [cs.hc] 22 Mar 2016 Abstract In this paper, we propose a real-time adaptive prediction method

More information

6 System architecture

6 System architecture 6 System architecture is an application for interactively controlling the animation of VRML avatars. It uses the pen interaction technique described in Chapter 3 - Interaction technique. It is used in

More information

MULTI-LAYERED HYBRID ARCHITECTURE TO SOLVE COMPLEX TASKS OF AN AUTONOMOUS MOBILE ROBOT

MULTI-LAYERED HYBRID ARCHITECTURE TO SOLVE COMPLEX TASKS OF AN AUTONOMOUS MOBILE ROBOT MULTI-LAYERED HYBRID ARCHITECTURE TO SOLVE COMPLEX TASKS OF AN AUTONOMOUS MOBILE ROBOT F. TIECHE, C. FACCHINETTI and H. HUGLI Institute of Microtechnology, University of Neuchâtel, Rue de Tivoli 28, CH-2003

More information

Web3D Standards. X3D: Open royalty-free interoperable standard for enterprise 3D

Web3D Standards. X3D: Open royalty-free interoperable standard for enterprise 3D Web3D Standards X3D: Open royalty-free interoperable standard for enterprise 3D ISO/TC 184/SC 4 - WG 16 Meeting - Visualization of CAD data November 8, 2018 Chicago IL Anita Havele, Executive Director

More information

Virtual Sculpting and Multi-axis Polyhedral Machining Planning Methodology with 5-DOF Haptic Interface

Virtual Sculpting and Multi-axis Polyhedral Machining Planning Methodology with 5-DOF Haptic Interface Virtual Sculpting and Multi-axis Polyhedral Machining Planning Methodology with 5-DOF Haptic Interface Weihang Zhu and Yuan-Shin Lee* Department of Industrial Engineering North Carolina State University,

More information

Peter Berkelman. ACHI/DigitalWorld

Peter Berkelman. ACHI/DigitalWorld Magnetic Levitation Haptic Peter Berkelman ACHI/DigitalWorld February 25, 2013 Outline: Haptics - Force Feedback Sample devices: Phantoms, Novint Falcon, Force Dimension Inertia, friction, hysteresis/backlash

More information

ISO/IEC JTC 1 VR AR for Education

ISO/IEC JTC 1 VR AR for Education ISO/IEC JTC 1 VR AR for January 21-24, 2019 SC24 WG9 & Web3D Meetings, Seoul, Korea Myeong Won Lee (U. of Suwon) Requirements Learning and teaching Basic components for a virtual learning system Basic

More information

Sound rendering in Interactive Multimodal Systems. Federico Avanzini

Sound rendering in Interactive Multimodal Systems. Federico Avanzini Sound rendering in Interactive Multimodal Systems Federico Avanzini Background Outline Ecological Acoustics Multimodal perception Auditory visual rendering of egocentric distance Binaural sound Auditory

More information

AR 2 kanoid: Augmented Reality ARkanoid

AR 2 kanoid: Augmented Reality ARkanoid AR 2 kanoid: Augmented Reality ARkanoid B. Smith and R. Gosine C-CORE and Memorial University of Newfoundland Abstract AR 2 kanoid, Augmented Reality ARkanoid, is an augmented reality version of the popular

More information

Lab 7: Introduction to Webots and Sensor Modeling

Lab 7: Introduction to Webots and Sensor Modeling Lab 7: Introduction to Webots and Sensor Modeling This laboratory requires the following software: Webots simulator C development tools (gcc, make, etc.) The laboratory duration is approximately two hours.

More information

Using Hybrid Reality to Explore Scientific Exploration Scenarios

Using Hybrid Reality to Explore Scientific Exploration Scenarios Using Hybrid Reality to Explore Scientific Exploration Scenarios EVA Technology Workshop 2017 Kelsey Young Exploration Scientist NASA Hybrid Reality Lab - Background Combines real-time photo-realistic

More information

MRT: Mixed-Reality Tabletop

MRT: Mixed-Reality Tabletop MRT: Mixed-Reality Tabletop Students: Dan Bekins, Jonathan Deutsch, Matthew Garrett, Scott Yost PIs: Daniel Aliaga, Dongyan Xu August 2004 Goals Create a common locus for virtual interaction without having

More information

The Application of Virtual Reality in Art Design: A New Approach CHEN Dalei 1, a

The Application of Virtual Reality in Art Design: A New Approach CHEN Dalei 1, a International Conference on Education Technology, Management and Humanities Science (ETMHS 2015) The Application of Virtual Reality in Art Design: A New Approach CHEN Dalei 1, a 1 School of Art, Henan

More information

Scaling Resolution with the Quadro SVS Platform. Andrew Page Senior Product Manager: SVS & Broadcast Video

Scaling Resolution with the Quadro SVS Platform. Andrew Page Senior Product Manager: SVS & Broadcast Video Scaling Resolution with the Quadro SVS Platform Andrew Page Senior Product Manager: SVS & Broadcast Video It s All About the Detail Scale in physical size and shape to see detail with context See lots

More information

Investigating the Post Processing of LS-DYNA in a Fully Immersive Workflow Environment

Investigating the Post Processing of LS-DYNA in a Fully Immersive Workflow Environment Investigating the Post Processing of LS-DYNA in a Fully Immersive Workflow Environment Ed Helwig 1, Facundo Del Pin 2 1 Livermore Software Technology Corporation, Livermore CA 2 Livermore Software Technology

More information