Extended abstract presented at the NSF Lake Tahoe Workshop for Collaborative Virtual Reality and Visualization, Oct.

Emerging tangible interfaces for facilitating collaborative immersive visualizations

Brygg Ullmer, Andrei Hutanu, Werner Benger, and Hans-Christian Hege
Konrad-Zuse-Zentrum für Informationstechnik Berlin
{ullmer, hutanu, benger, hege}@zib.de
Takustrasse 7, Berlin, Germany

ABSTRACT

We describe work in progress toward a tangible interface for facilitating collaborative visualization within immersive environments. The interface is based upon a system of pads, cards, and wheels that physically embody key digital operations, data, and parameters. These visualization artifacts provide a simple means for collaboratively engaging with time, space, parameters, and information aggregates, which we believe will generalize over a variety of applications.

INTRODUCTION

Several recent systems have made compelling use of instrumented, physically representational objects that serve as interface elements within immersive workbenches and CAVEs [Schkolne et al. 2001, Keefe et al. 2001]. These "tangibles" [Ullmer and Ishii 2001] have taken immersive interaction to new levels by allowing users to spatially multiplex key operations across specialized tools which remain near to hand within a kinesthetic reference frame [1]. Applications involving spatial interaction with intrinsically three-dimensional data lend themselves naturally to immersive virtual reality (VR), allowing users to make direct deictic (pointing) reference toward 3D geometrical constructs. However, even applications that center upon inherently geometrical content usually build, implicitly or explicitly, upon interactions with abstract digital information that has no intrinsic spatial representation. Example tasks include loading and saving data, adjusting parameters, and establishing links with remote collaboration partners.
[1] Kinesthetic reference frames are the workspace and origin within which the hands operate [Balakrishnan and Hinckley 1999].

While considerable effort has been invested into 3D graphical interaction with abstract digital content, the area has proven challenging, and few widely accepted methods have emerged. In part, this likely relates to the visual design challenges involved in realizing legible, compelling 3D graphical representations of abstract content. However, we believe that the nature of prior interaction devices used for immersive VR is also a major factor. To consider a related example, in a chapter titled "The Invention of the Mouse," speaking with respect to the first application of the tablet-based stylus (during the mid-1960s, in the GRAIL system), Bardini [2000] writes: "it was a design dead end at that point to consider that the same input device could be the device of choice to enter both graphics (line drawing) and text (character recognition)."

Projected into the space of immersive virtual reality, this raises the question of whether 3D pointing devices such as wands, gloves, and their more specialized descendants are indeed well-suited to serve as the primary interaction device for both inherently geometrical and abstract content, or whether this pervasive combination could be a design dead end that is holding back progress in immersive environments.

In this extended abstract, we introduce work in progress on a tangible interface [Ullmer and Ishii 2001] for collaborative interaction with virtual environments. Our interface is based on a system of physical pads, cards, and wheels that serve as both representations and controls for digital operations, parameters, data sets, computing resources, access credentials, and other computationally-mediated content (Figures 1, 2).
We believe our approach may offer benefits including parallel two-handed interaction with spatial and abstract content; improved collaborative use; enhanced manipulation of digital parameters; improved migration between desktop and immersive environments; and easier user authentication to grid computing resources.

Figure 1: Prospective collaborative use of interaction pads with a stereo display wall. Users hold a 3D pointer in one hand, while manipulating visualization artifacts with their second hand.

We begin by discussing our basic interaction approach. We then introduce the particular interaction devices we are developing and examples of their use, as well as considering related work. We also discuss several underlying technical factors, including the user interface implications of grid computing and issues surrounding access to shared network resources, and conclude with a brief preview of future work.

INTERACTION APPROACH

Our work aims to create a system of visualization artifacts that facilitate the physical manipulation of abstract digital information. These are initially targeted for use within immersive virtual environments, but we believe their functionality generalizes more broadly. These visualization artifacts support a series of simple, task-specific manipulations of online information (e.g., remote datasets), in the absence of general-purpose pointing devices and alphabetic keyboards.

Our interface is based upon three kinds of physical/digital objects: pads, cards, and wheels (Figures 2-9). Interaction pads are small, modular objects with specialized work surfaces that support specific digital operations. These pads are used together with data cards and parameter wheels: physical tokens that serve as representations and controls for online data and parameters.

Figure 2: Close-up of visualization artifacts. A floor-standing interaction stand supports and organizes a set of pads, cards, and wheels. Some of these objects are passively stored (on the left and right of the stand), while others are actively used (in the stand's center). Here, one of each of the four core interaction pads is placed upon the stand's central workspace. A data card is placed on the binding pad, while two parameter wheels are placed on the parameter pad (their color is from LED backlighting).

These visualization artifacts provide a small subset of the functionality available on traditional desktop graphical interfaces. We assume that many interactions with the information underlying the immersive display will continue to be conducted in traditional desktop-based 2D GUIs.
Our system will provide a GUI-based means for easily binding selected digital information onto physical tokens, building on the "monitor slot" approach of [Ullmer et al. 1998]. Users may then simply access and manipulate this content on interaction pads, while their visual focus remains directed toward the main object of interest. In immersive virtual environments, this will often be some form of 3D graphical visualization. In collaborative usage contexts, the object of interest may also be collocated or remote people.

We expect that within immersive environments, users may continue to use both generic and specialized 3D pointing devices as the primary tools for spatially manipulating 3D graphical content. Toward this, our interface allows users to hold a 3D tracker in one hand, while using the second hand to engage with the abstract digital information and operations represented by our visualization artifacts. We imagine that truly simultaneous manipulation of both the tracker and visualization artifacts may be infrequent. Nonetheless, we believe this two-handed approach will help minimize the set-up and tear-down overhead that previous approaches have required when moving between spatial and abstract operations: acquiring the tracking device, switching software modes, retargeting to a new spatial area of interest, and resuming interaction.

We believe that perhaps the foremost value of our interface will be enabling users to easily perform tasks like loading and saving data, establishing video links, manipulating simulation parameters, and controlling presentations while their attention remains focused upon the object of interest. Implicit is the belief that engagement with 3D graphical visualizations and collaborators, rather than secondary (albeit important) interface tasks, is generally the main thread of interaction.
We believe our interface's simple controls and physical legibility may be less cognitively demanding than graphical interfaces (especially in the context of immersive environments), and could support VR use by a broader range of users. We hope that our visualization artifacts, used together in two-handed interaction with 3D pointing devices, will provide powerful tools for facilitating this balance between functionality and attentional cost.

INTERACTION DEVICES

Our tangible interface is based upon three kinds of physical objects: pads, cards, and wheels. Interaction pads are modular elements used for operations like loading and saving data, establishing video links, manipulating simulation parameters, and controlling presentations. In their initial incarnation, they are embodied as a series of rectangular modules, each roughly the size of a VHS cassette. They include embedded RFID readers for sensing tagged physical tokens, and communicate via wired and wireless Ethernet. We are currently developing four core interaction pads:

- the binding pad: for establishing and accessing data card bindings;
- the placement pad: for spatially arranging data card contents on graphical displays;
- the parameter pad: for binding and manipulating digital parameters using parameter wheels; and
- the control pad: for navigating through media collections, streams, and temporal data (e.g., images, video, and simulation time steps).

Taken together, these pads will provide simple means for physically engaging with time, space, parameters, and information aggregates, which we believe will generalize over a variety of applications. These interaction pads are used together with data cards and parameter wheels. Data cards may be used to represent content such as data sets, simulation parameters, slide presentations, and live portals (e.g., video conference sessions). Parameter wheels will be bindable to different parameters, and used as a kind of reconfigurable dial-box knob to adjust and control simulation and visualization parameters.

Figures 1 and 2 illustrate one prospective usage example. In addition to the pads, cards, and users, these images depict several other details. First, they illustrate an ImmersaDesk-style large-format stereo display where immersive visualizations are shown. Secondly, the users wear stereo glasses and use a 3D tracker in one hand, while interacting with visualization artifacts with their second hand. As interactive use of wall displays is frequently conducted while sitting or standing immediately adjacent to the display, we have designed furniture in the form of an interaction stand for physically supporting and organizing the visualization artifacts. The stand also helps provide power and Ethernet connectivity to the interaction pads. We next consider the structure and function of our cards, wheels, and pads in more detail.

Data cards

Data cards are RFID-tagged cards with the size and feel of a credit card (Figure 3), and are the primary medium for representing digital information within our system. These cards are each marked with six numbered, (optionally) labeled rows. Each of these rows can be associated with one or more elements of online (URL/URN-referenced) information. One or more of these rows can be selected with the binding pad as the active binding of the card container. This approach works to balance the benefits of physical embodiment and the legibility of visually labeled contents, while combating the flood of objects associated with traditional "one object, one binding" TUI approaches.
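As a concrete illustration of the row/binding model just described, the following Python sketch models a data card with six bindable rows and an active selection. All names here are hypothetical; in the actual system, the bindings live in a SQL database referenced from the card's RFID tag, not on a host computer.

```python
# Hypothetical sketch of the data-card model: six rows, each bindable to one
# or more URLs/URNs, with one or more rows selectable as the active binding.
from dataclasses import dataclass, field

ROWS_PER_CARD = 6

@dataclass
class DataCard:
    serial: str                                   # unique ID from the RFID tag
    db_address: str                               # network address of the SQL database
    rows: dict = field(default_factory=dict)      # row number -> list of URLs
    active_rows: set = field(default_factory=set)

    def bind(self, row, urls):
        """Associate a card row with one or more online references."""
        if not 1 <= row <= ROWS_PER_CARD:
            raise ValueError("data cards have six rows")
        self.rows[row] = list(urls)

    def select(self, row):
        """Mark a row as an active binding (a binding-pad button press)."""
        if row not in self.rows:
            raise KeyError("row has no binding")
        self.active_rows.add(row)

    def active_bindings(self):
        """URLs referenced by the currently selected rows."""
        return [url for row in sorted(self.active_rows) for url in self.rows[row]]

card = DataCard(serial="04:A3:1F", db_address="db.example.org:5432")
card.bind(1, ["urn:sim/bh-collision/run7"])
card.bind(2, ["https://example.org/slides/openhouse.pdf"])
card.select(1)
print(card.active_bindings())   # -> ['urn:sim/bh-collision/run7']
```

The selection state travels with the card (conceptually, with its database record), which is what allows a selection made on one pad to persist when the card is moved to another.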
Figure 3: Example data cards

Data cards are also color-coded and labeled with text (and perhaps visual icons) on their upper corner surfaces. This is intended to allow the cards to be rapidly sorted, and identified while held in a hand, in a fashion similar to traditional playing cards [Parlett 1999]. The RFID tags within these cards each hold a unique serial number, as well as several hundred bytes of non-volatile RAM. The non-volatile RAM holds the network address of a SQL database, which in turn holds the actual URLs and authentication information associated with the data cards, as well as additional cryptographic information.

Parameter wheels

Parameter wheels are an approach for expressing discrete and continuous parameters using small cylindrical wheels [Ullmer et al. 2003] (Figure 4). These wheels are used in fashions resembling the dials of dial boxes. In addition to the strengths of dial boxes, parameter wheels can be dynamically bound to different digital parameters. The physical/digital constraints within which parameter wheels are used may also be bound to different digital interpretations, significantly increasing the expressive power of this approach. For example, in Figure 4, the two left wheel constraints are bound to the y and x axes of a scatterplot visualization. By placing a wheel onto the right x constraint, the user both queries the database for the wheel's parameter, and plots the results of this query along the scatterplot's x axis. Wheel rotation then allows manipulation of the wheel's associated parameter values.

Figure 4: Example parameter wheels (from [Ullmer et al. 2003])

Parameter wheels can be bound to desired parameters using special data cards. Manipulation of these wheels on the parameter pad can then be used to manipulate simulation parameters (e.g., time) as well as visualization parameters (e.g., transparency).

Interaction pads

Interaction pads represent specific digital operations.
Their individual work surfaces contain RFID sensing regions, buttons, and displays for sensing and mediating interactions with data cards and parameter wheels. The interaction pads share several features. Each has a work surface of roughly 16.5 x 10.5 cm, and a depth of 4 cm (this will be reduced in future iterations). Each also has four indicator LEDs, two of which have associated buttons (Figure 5). These include:

- The target LED indicates that the interaction pad is bound to a specific visual display surface (e.g., an immersive wall or computer screen). By pressing the adjoining button, the target binding may be cleared.
- The authorization LED indicates that the interaction pad has been securely authenticated with one or more users' authentication credentials. Especially within collaboration contexts, this security mechanism plays an important role in (e.g.) allowing users to access or save potentially sensitive remote datasets. Pad authorization may be revoked with the adjoining button.
- The sensing LED indicates that one or more of the embedded RFID sensors is successfully detecting an RFID-tagged data card or parameter wheel. This helps users confirm that the device is functioning properly.
- The network LED indicates that the pad is successfully maintaining a network link with its remote proxying computer. The actual visualization operations mediated by interaction pads are proxied by remote computers; this LED confirms that the network link is active.

Figure 5: Interaction pad status indicators and controls

We next briefly describe the functions of our current core interaction pads.

1. Binding pad

Binding, the process by which digital information is associated with physical objects, is a central activity within tangible interfaces [Cohen et al. 1999]. The binding pad is the principal visualization artifact used to express this assignment. It can be used in several different ways: 1) selecting elements from the list of bindings contained within data cards; 2) copying information between physical tokens; and 3) making bindings to other interaction pads.

The binding pad is composed of special constraints (or "cells") for a source and a target data card, and a series of selection buttons that can be used to select particular data card rows (Figure 6). Pad interactions begin by placing a data card onto the source or target cell. If a particular row of the card has previously been selected, it will be illuminated by side-facing LEDs. A selection button is located next to each row of the data card. For the source cell, pushing one (or more) button(s) selects the corresponding row(s), which is again indicated by edge-illumination. This selection is maintained until explicitly changed, even if the card is moved to a different pad.
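The source-to-target copy operation at the heart of the binding pad might be sketched as follows. The card structures and function names are illustrative only, not the actual pad firmware:

```python
# Illustrative sketch: pressing a selection button on the target side copies
# the source card's selected links into the chosen target row. Cards are
# modeled here as plain dicts with "rows" and "selected" fields.
def copy_selected(source_card, target_card, target_row):
    """Binding-pad copy: selected source rows -> one target row."""
    selected = [url
                for row in sorted(source_card["selected"])
                for url in source_card["rows"][row]]
    if not selected:
        raise ValueError("no rows selected on the source card")
    target_card["rows"][target_row] = selected
    return target_card

source = {"rows": {1: ["urn:data/a"], 2: ["urn:data/b"]}, "selected": {1, 2}}
target = {"rows": {}, "selected": set()}
copy_selected(source, target, target_row=3)
print(target["rows"][3])   # -> ['urn:data/a', 'urn:data/b']
```

Note that multiple selected source rows collapse into a single target row; this matches the card model above, in which each row may reference one or more elements of online information.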
Figure 6: Binding pad (CAD layout, faceplate view)

The target cell's behavior depends upon which data cards are present on the binding pad. In the simplest case, a normal data card is present in both the source and target cells. Here, pressing a selection button on the target side will copy any selected links from the source card into the specified target card row. In addition, we are designing special data cards that have different behaviors. For example, we are developing data cards that refer to different output devices (e.g., projectors and screens). Here, transferring data into such a row has the effect of displaying this data on the selected device.

2. Placement pad

The placement pad is used to place one or more information elements onto different regions of a graphical display (e.g., a monitor or projection screen). The information to be displayed is usually expressed by data cards. Example data includes 3D datasets, slides, and live video streams. Placement is expressed by the card's location within the pad's five cells (four corner positions and one central position). Multiple data cards may be present simultaneously on the pad.

Figure 7: Placement pad (CAD layout, faceplate view)

The placement pad can be associated with a number of different display devices. These destinations can be specified with the binding pad, or directly in conjunction with tagged output devices. Multiple placement pads can also be bound to the same display device, which we believe will hold special value in collaboration contexts.

3. Parameter pad

The parameter pad builds on the parameter wheel concept discussed earlier in the paper. To modify a parameter, the associated wheel is placed onto a cell within the parameter pad. When the wheel is turned, the parameter is modified accordingly, with its new value and consequence displayed on the shared screen. An LCD display may also be used within the pad. We are also considering force feedback, which might have special value for high-latency operations (e.g., steering computationally intensive simulations).

Figure 8: Parameter pad (CAD layout, faceplate view)
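A minimal sketch of how wheel rotation might map onto a bound parameter follows. The class and parameter names are assumptions for illustration; the paper specifies only that turning a bound wheel adjusts its parameter's value.

```python
# Hypothetical model of a bound parameter wheel: each rotation detent adjusts
# the parameter by a fixed step, clamped to the parameter's valid range.
class ParameterWheel:
    def __init__(self, name, value, step, lo, hi):
        self.name, self.value = name, value
        self.step, self.lo, self.hi = step, lo, hi

    def rotate(self, detents):
        """detents > 0: clockwise; detents < 0: counter-clockwise."""
        self.value = min(self.hi, max(self.lo, self.value + detents * self.step))
        return self.value

transparency = ParameterWheel("transparency", value=0.5, step=0.05, lo=0.0, hi=1.0)
transparency.rotate(4)                 # four clockwise detents
print(round(transparency.value, 2))    # -> 0.7
```

Rebinding a wheel would amount to swapping in a different (name, value, step, range) tuple, which is what makes a single physical wheel reusable across simulation and visualization parameters.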

4. Control pad

Time-based navigation is an important component of many interactions. For example, in a slide presentation, it is very valuable to have a simple, rapid means to jump to the next or previous slide, or to jump with random access to different parts of a presentation. Video playback shares similar benefits. These kinds of controls are also useful for manipulating simulation parameters such as time. The control pad is intended to simplify these kinds of interactions. It incorporates media-player-style forward, back, fast forward/back, and play/pause buttons, and a slider for jumping to absolute positions. We expect this pad will be most frequently used for presentations, media browsing, and similar operations. We are also considering allowing the pad's controls to operate upon arbitrary parameters.

Figure 9: Control pad (CAD layout, faceplate view)

RELATED WORK

Our visualization artifacts integrate and build upon a number of interaction techniques introduced in previous papers, both by ourselves and others. We feel this allows us to leverage, distill, and combine some of the strongest results of prior work into a single integrated system. Given limited space, we restrict our related work discussion to these systems.

First, our use of symbolic physical objects as dynamically rebindable containers for aggregates of abstract digital information builds upon the mediaBlocks system [Ullmer et al. 1998]. We have used cards rather than blocks, as well as multiple bindings per token, in an effort to improve scalability and usage pragmatics. The parameter pad and wheels draw from [Ullmer et al. 2003]. Also, the pad concept builds upon the DataTiles approach of Rekimoto, Ullmer, and Oba [2001]. The Paper Palette [Nelson et al. 1999] also made use of card objects as data representations within a tangible interface.
Our work differs both in the way data cards are used to represent digital information aggregates, and in the way our cards are composed with pads (which represent operations). Finally, the ToonTown interface [Singer et al. 1999] demonstrated a compelling example of physical objects serving as representations of remote people in an audio conferencing application. Similarly, we are using data cards to represent remote participants for (e.g.) video conferencing.

EXAMPLE INTERACTIONS

The visualization artifacts are still under development, and a detailed discussion of possible interactions goes beyond the scope of this paper. However, it is useful to mention several examples of intended usage contexts. First, we have completed an early working prototype of the binding pad, and successfully used this to interactively present a number of stereo visualizations during an open house. The Amira visualization software was used to load ~15 visualizations of colliding black holes, which were mapped across four data cards. Some visualizations were stereo movies, while others allowed interaction with a 3D tracker or mouse. In this context, the binding pad was valuable in allowing rapid, on-demand access to and activation of a number of visualizations, without consuming display real estate and with minimal attentional requirements for the presenter. We expect the use of visualization artifacts to deliver demonstrations and presentations will be a frequent application, especially once the control and parameter pads become available and allow increased interactivity with content.
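The card-to-visualization mapping used in the open-house demonstration can be approximated in a few lines. The row allocation and the activate() callback are illustrative assumptions; Amira's actual scripting interface is not shown here.

```python
# Hypothetical sketch of the open-house mapping: visualization IDs distributed
# across data-card rows, with row selection triggering activation. We assume
# four rows in use per card here, so ~15 visualizations fit on four cards.
def build_card_map(visualizations, rows_per_card=6):
    """Distribute visualization IDs across cards, rows_per_card per card."""
    cards = {}
    for i, vis in enumerate(visualizations):
        card, row = divmod(i, rows_per_card)
        cards.setdefault(card, {})[row + 1] = vis
    return cards

def on_row_selected(cards, card, row, activate):
    """Binding-pad row selection activates the bound visualization."""
    return activate(cards[card][row])

vis_ids = [f"bh-collision-{n:02d}" for n in range(15)]   # ~15 visualizations
cards = build_card_map(vis_ids, rows_per_card=4)         # fits on four cards
print(len(cards))                                        # -> 4
print(on_row_selected(cards, card=2, row=1, activate=lambda v: f"show {v}"))
```

In the real setup, activate() would stand in for whatever command loads and displays the bound visualization on the stereo wall.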
A short summary of these and other promising applications includes:

- Presentations and demonstrations (with binding, control, parameter, and placement pads)
- Displaying, browsing, and manipulating media (with binding, control, and placement pads)
- Parameter studies (with parameter and binding pads)
- Loading data and saving visualization snapshots (with the binding pad)
- Video conferencing and other remote collaboration (with binding and placement pads)

OTHER COLLABORATION ISSUES

Throughout the paper, we have mentioned some of the ways that we are working to support collaborative use. In the following paragraphs, we consider a few additional collaboration issues that impact our design.

First, from a technical standpoint, collaborative use of our interface increases the importance of security mechanisms for linking our interaction devices with their associated data and operations. While these technical issues have received little attention in previous work with tangible interfaces (and perhaps virtual reality systems as well), we believe they are broadly important for any interaction environment that supports multiple users and/or networked content. This is especially true for collaborative interfaces that cause information to be saved or modified. For example, capabilities for saving data and steering remote supercomputing simulations are important for our users. Especially when multiple users collaboratively manipulate data that is hosted at different locales with different access rights, infrastructure and interfaces for managing user authentication and secure operations are essential.

We are linking our interface with grid computing infrastructure under development by the EC GridLab project [Allen et al. 2003]. For example, we are developing special data cards that represent user credentials. These cards represent up to six different credentials.
Internally, they store both a unique ID and (behind the tag's hardware cryptographic protection) a decryption key that can be used to reconstruct a valid credential. When a credential card is placed onto an interaction pad and the networked authorization dialogue is successfully completed, the pad's authorization LED will light. Future read and write accesses initiated by (or more precisely, on behalf of) the pad will use these access rights. In this way, different interaction pads within the same immersive environment can potentially have different credentials, allowing users (perhaps from competing organizations) to interact collaboratively with secure remote content.

At present, the processors embedded within interaction pads have limited power, and do not support the Grid Application Toolkit (GAT) necessary to carry out grid operations [Allen et al. 2003]. For this reason, grid transactions will not be executed on the interaction pads themselves, but rather on remote proxying computers. The interaction pad's embedded processor (currently, a Rabbit RCM3010 core module) is able to contact a remote server, and exchange authentication information using AES encryption over the Rabbit's Ethernet port. The remote server then selects a grid-accessible computer as a proxy for the interaction pad. Henceforward, the interaction pad sends interaction events (e.g., RFID entrance/exit, button presses, and rotation events) to its remote proxy, and the remote proxy translates these events into grid-mediated visualization operations.

These paragraphs give some flavor for the technical aspects of our interface that relate to collaboration. Collaboration also raises a number of other interaction issues. As one example, ensuring the visibility of users' physical actions, both to the controlling user and to any observing users, is one such concern. Kinesthetic manipulation of physical controls should allow manipulation of the interface while the eyes are focused on other (often graphical) objects of interest.
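The pad-to-proxy event stream described above can be sketched as follows. The event names and the JSON wire format are assumptions for illustration; the paper specifies only that pads forward RFID entrance/exit, button, and rotation events to their proxies.

```python
# Hedged sketch of the pad-to-proxy event protocol: the pad serializes
# interaction events, and the proxy routes them to per-kind handlers that
# would drive the actual grid-mediated visualization operations.
import json
import time

def make_event(pad_id, kind, **payload):
    """Pad side: serialize one interaction event for the remote proxy."""
    assert kind in {"rfid_enter", "rfid_exit", "button", "rotation"}
    return json.dumps({"pad": pad_id, "kind": kind,
                       "time": payload.pop("time", time.time()), **payload})

def dispatch(event_json, handlers):
    """Proxy side: route a received event to a handler by event kind."""
    event = json.loads(event_json)
    return handlers[event["kind"]](event)

handlers = {"rotation": lambda e: f"set {e['wheel']} by {e['detents']} detents"}
msg = make_event("parameter-pad-1", "rotation", wheel="time", detents=-2)
print(dispatch(msg, handlers))   # -> set time by -2 detents
```

Keeping the pad side down to event serialization is what makes the limited Rabbit-class processor sufficient: all grid-facing logic stays on the proxy.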
However, it is important to provide feedback to indicate the actions that are being sensed and interpreted. Toward this, we are implementing a system of animated visual indicators. These are 3D graphical representations of parameters, menu selections, etc., which animate or fade onto the target display when the corresponding physical controls are modified, and animate away when input ceases.

CONCLUSION AND FUTURE WORK

We have described a tangible interface for facilitating collaborative visualization within immersive environments. This is based upon a system of pads, cards, and wheels that embody key digital operations, data, and parameters. We are currently midway through the completion of this interface. As such, our first priority is to complete the implementation of the functionality we have described, and to deploy the pads for test usage. We have received support from the Max Planck Society for deploying a series of visualization artifacts for daily use by astrophysicists; the interface we have described has been developed in collaboration with these scientists. We are hopeful that the resulting functionality will prove valuable to a broad range of immersive and non-immersive visualization tasks. We are also considering a number of extensions to the functionality we have described, including special kinds of data cards; specialized (domain-specific) interaction pads; extensions to the parameter pad; and the use of rewritable data card surfaces, allowing automatic updating of labels when cards are bound to new contents.

ACKNOWLEDGMENTS

We thank the European Community (grant IST ) and the GridLab project for funding this work. The hardware for our interface's first trial deployment at the Max Planck Institute for Gravitational Physics (the Albert Einstein Institute) is funded by a BAR grant from the Max Planck Society.
We thank Ed Seidel, Christa Hausmann-Jamin, Gabrielle Allen, Michael Koppitz, Frank Herrmann, Thomas Radke, and others at AEI for their enthusiasm, collaboration, and support. Our work builds in part on earlier research at the MIT Media Laboratory under the leadership of Prof. Hiroshi Ishii, with support from the Things That Think consortium, IBM, Steelcase, Intel, and other sponsors. Fabrizio Iacopetti provided key PIC processor firmware used within the parameter pad. Indeed/TGS provided the Amira licenses used by our collaboration partners.

REFERENCES

1. Allen, G., Davis, K., et al. (2003). Enabling Applications on the Grid: A GridLab Overview. In International Journal of High Performance Computing Applications: Special Issue on Grid Computing, August 2003.
2. Balakrishnan, R., and Hinckley, K. (1999). The Role of Kinesthetic Reference Frames in Two Handed Input Performance. In Proceedings of UIST '99.
3. Bardini, T. (2000). Bootstrapping: Douglas Engelbart, Coevolution, and the Origins of Personal Computing. Stanford: Stanford University Press.
4. Keefe, D., Acevedo, D., et al. (2001). CavePainting: A Fully Immersive 3D Artistic Medium and Interactive Experience. In Proceedings of I3D 2001.
5. Nelson, L., Ichimura, S., Pedersen, E., and Adams, L. (1999). Palette: A Paper Interface for Giving Presentations. In Proceedings of CHI '99.
6. Parlett, D. (1999). Oxford History of Board Games. Oxford: Oxford University Press.
7. Rekimoto, J., Ullmer, B., and Oba, H. (2001). DataTiles: A Modular Platform for Mixed Physical and Graphical Interactions. In Proceedings of CHI '01.
8. Schkolne, S., Pruett, M., and Schroeder, P. (2001). Surface Drawing: Creating Organic 3D Shapes with the Hand and Tangible Tools. In Proceedings of CHI 2001.
9. Singer, A., Hindus, D., Stifelman, L., and White, S. (1999). Tangible Progress: Less is More in Somewire Audio Spaces. In Proceedings of CHI '99.
10. Ullmer, B., Ishii, H., and Jacob, R. (2003). Tangible Query Interfaces: Physically Constrained Tokens for Manipulating Database Queries. In Proceedings of INTERACT 2003.
11. Ullmer, B., and Ishii, H. (2001). Emerging Frameworks for Tangible User Interfaces. In HCI in the New Millennium, John M. Carroll, ed.
12. Ullmer, B., Ishii, H., and Glas, D. (1998). mediaBlocks: Physical Containers, Transports, and Controls for Online Media. In Computer Graphics Proceedings (SIGGRAPH '98).


More information

A Hybrid Immersive / Non-Immersive

A Hybrid Immersive / Non-Immersive A Hybrid Immersive / Non-Immersive Virtual Environment Workstation N96-057 Department of the Navy Report Number 97268 Awz~POved *om prwihc?e1oaa Submitted by: Fakespace, Inc. 241 Polaris Ave. Mountain

More information

Tangible User Interfaces

Tangible User Interfaces Tangible User Interfaces Seminar Vernetzte Systeme Prof. Friedemann Mattern Von: Patrick Frigg Betreuer: Michael Rohs Outline Introduction ToolStone Motivation Design Interaction Techniques Taxonomy for

More information

CS 315 Intro to Human Computer Interaction (HCI)

CS 315 Intro to Human Computer Interaction (HCI) CS 315 Intro to Human Computer Interaction (HCI) Direct Manipulation Examples Drive a car If you want to turn left, what do you do? What type of feedback do you get? How does this help? Think about turning

More information

Token+Constraint Systems for Tangible Interaction with Digital Information

Token+Constraint Systems for Tangible Interaction with Digital Information Token+Constraint Systems for Tangible Interaction with Digital Information BRYGG ULLMER Zuse Institute Berlin (ZIB) HIROSHI ISHII MIT Media Laboratory and ROBERT J. K. JACOB Tufts University We identify

More information

- Modifying the histogram by changing the frequency of occurrence of each gray scale value may improve the image quality and enhance the contrast.

- Modifying the histogram by changing the frequency of occurrence of each gray scale value may improve the image quality and enhance the contrast. 11. Image Processing Image processing concerns about modifying or transforming images. Applications may include enhancing an image or adding special effects to an image. Here we will learn some of the

More information

Experience of Immersive Virtual World Using Cellular Phone Interface

Experience of Immersive Virtual World Using Cellular Phone Interface Experience of Immersive Virtual World Using Cellular Phone Interface Tetsuro Ogi 1, 2, 3, Koji Yamamoto 3, Toshio Yamada 1, Michitaka Hirose 2 1 Gifu MVL Research Center, TAO Iutelligent Modeling Laboratory,

More information

Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms

Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms Published in the Proceedings of CHI '97 Hiroshi Ishii and Brygg Ullmer MIT Media Laboratory Tangible Media Group 20 Ames Street,

More information

preface Motivation Figure 1. Reality-virtuality continuum (Milgram & Kishino, 1994) Mixed.Reality Augmented. Virtuality Real...

preface Motivation Figure 1. Reality-virtuality continuum (Milgram & Kishino, 1994) Mixed.Reality Augmented. Virtuality Real... v preface Motivation Augmented reality (AR) research aims to develop technologies that allow the real-time fusion of computer-generated digital content with the real world. Unlike virtual reality (VR)

More information

MRT: Mixed-Reality Tabletop

MRT: Mixed-Reality Tabletop MRT: Mixed-Reality Tabletop Students: Dan Bekins, Jonathan Deutsch, Matthew Garrett, Scott Yost PIs: Daniel Aliaga, Dongyan Xu August 2004 Goals Create a common locus for virtual interaction without having

More information

ABSTRACT. Keywords Virtual Reality, Java, JavaBeans, C++, CORBA 1. INTRODUCTION

ABSTRACT. Keywords Virtual Reality, Java, JavaBeans, C++, CORBA 1. INTRODUCTION Tweek: Merging 2D and 3D Interaction in Immersive Environments Patrick L Hartling, Allen D Bierbaum, Carolina Cruz-Neira Virtual Reality Applications Center, 2274 Howe Hall Room 1620, Iowa State University

More information

Advanced User Interfaces: Topics in Human-Computer Interaction

Advanced User Interfaces: Topics in Human-Computer Interaction Computer Science 425 Advanced User Interfaces: Topics in Human-Computer Interaction Week 04: Disappearing Computers 90s-00s of Human-Computer Interaction Research Prof. Roel Vertegaal, PhD Week 8: Plan

More information

HUMAN COMPUTER INTERFACE

HUMAN COMPUTER INTERFACE HUMAN COMPUTER INTERFACE TARUNIM SHARMA Department of Computer Science Maharaja Surajmal Institute C-4, Janakpuri, New Delhi, India ABSTRACT-- The intention of this paper is to provide an overview on the

More information

COLLABORATION WITH TANGIBLE AUGMENTED REALITY INTERFACES.

COLLABORATION WITH TANGIBLE AUGMENTED REALITY INTERFACES. COLLABORATION WITH TANGIBLE AUGMENTED REALITY INTERFACES. Mark Billinghurst a, Hirokazu Kato b, Ivan Poupyrev c a Human Interface Technology Laboratory, University of Washington, Box 352-142, Seattle,

More information

Programme TOC. CONNECT Platform CONNECTION Client MicroStation CONNECT Edition i-models what is comming

Programme TOC. CONNECT Platform CONNECTION Client MicroStation CONNECT Edition i-models what is comming Bentley CONNECT CONNECT Platform MicroStation CONNECT Edition 1 WWW.BENTLEY.COM 2016 Bentley Systems, Incorporated 2016 Bentley Systems, Incorporated Programme TOC CONNECT Platform CONNECTION Client MicroStation

More information

Application of 3D Terrain Representation System for Highway Landscape Design

Application of 3D Terrain Representation System for Highway Landscape Design Application of 3D Terrain Representation System for Highway Landscape Design Koji Makanae Miyagi University, Japan Nashwan Dawood Teesside University, UK Abstract In recent years, mixed or/and augmented

More information

Translucent Tangibles on Tabletops: Exploring the Design Space

Translucent Tangibles on Tabletops: Exploring the Design Space Translucent Tangibles on Tabletops: Exploring the Design Space Mathias Frisch mathias.frisch@tu-dresden.de Ulrike Kister ukister@acm.org Wolfgang Büschel bueschel@acm.org Ricardo Langner langner@acm.org

More information

A Kinect-based 3D hand-gesture interface for 3D databases

A Kinect-based 3D hand-gesture interface for 3D databases A Kinect-based 3D hand-gesture interface for 3D databases Abstract. The use of natural interfaces improves significantly aspects related to human-computer interaction and consequently the productivity

More information

VEWL: A Framework for Building a Windowing Interface in a Virtual Environment Daniel Larimer and Doug A. Bowman Dept. of Computer Science, Virginia Tech, 660 McBryde, Blacksburg, VA dlarimer@vt.edu, bowman@vt.edu

More information

Interaction Design. Chapter 9 (July 6th, 2011, 9am-12pm): Physical Interaction, Tangible and Ambient UI

Interaction Design. Chapter 9 (July 6th, 2011, 9am-12pm): Physical Interaction, Tangible and Ambient UI Interaction Design Chapter 9 (July 6th, 2011, 9am-12pm): Physical Interaction, Tangible and Ambient UI 1 Physical Interaction, Tangible and Ambient UI Shareable Interfaces Tangible UI General purpose TUI

More information

What was the first gestural interface?

What was the first gestural interface? stanford hci group / cs247 Human-Computer Interaction Design Studio What was the first gestural interface? 15 January 2013 http://cs247.stanford.edu Theremin Myron Krueger 1 Myron Krueger There were things

More information

Interaction Techniques for Musical Performance with Tabletop Tangible Interfaces

Interaction Techniques for Musical Performance with Tabletop Tangible Interfaces Interaction Techniques for Musical Performance with Tabletop Tangible Interfaces James Patten MIT Media Lab 20 Ames St. Cambridge, Ma 02139 +1 857 928 6844 jpatten@media.mit.edu Ben Recht MIT Media Lab

More information

Attorney Docket No Date: 25 April 2008

Attorney Docket No Date: 25 April 2008 DEPARTMENT OF THE NAVY NAVAL UNDERSEA WARFARE CENTER DIVISION NEWPORT OFFICE OF COUNSEL PHONE: (401) 832-3653 FAX: (401) 832-4432 NEWPORT DSN: 432-3853 Attorney Docket No. 98580 Date: 25 April 2008 The

More information

VICs: A Modular Vision-Based HCI Framework

VICs: A Modular Vision-Based HCI Framework VICs: A Modular Vision-Based HCI Framework The Visual Interaction Cues Project Guangqi Ye, Jason Corso Darius Burschka, & Greg Hager CIRL, 1 Today, I ll be presenting work that is part of an ongoing project

More information

Unit 23. QCF Level 3 Extended Certificate Unit 23 Human Computer Interaction

Unit 23. QCF Level 3 Extended Certificate Unit 23 Human Computer Interaction Unit 23 QCF Level 3 Extended Certificate Unit 23 Human Computer Interaction Unit 23 Outcomes Know the impact of HCI on society, the economy and culture Understand the fundamental principles of interface

More information

synchrolight: Three-dimensional Pointing System for Remote Video Communication

synchrolight: Three-dimensional Pointing System for Remote Video Communication synchrolight: Three-dimensional Pointing System for Remote Video Communication Jifei Ou MIT Media Lab 75 Amherst St. Cambridge, MA 02139 jifei@media.mit.edu Sheng Kai Tang MIT Media Lab 75 Amherst St.

More information

HELPING THE DESIGN OF MIXED SYSTEMS

HELPING THE DESIGN OF MIXED SYSTEMS HELPING THE DESIGN OF MIXED SYSTEMS Céline Coutrix Grenoble Informatics Laboratory (LIG) University of Grenoble 1, France Abstract Several interaction paradigms are considered in pervasive computing environments.

More information

Spatial Mechanism Design in Virtual Reality With Networking

Spatial Mechanism Design in Virtual Reality With Networking Mechanical Engineering Conference Presentations, Papers, and Proceedings Mechanical Engineering 9-2001 Spatial Mechanism Design in Virtual Reality With Networking John N. Kihonge Iowa State University

More information

Beyond: collapsible tools and gestures for computational design

Beyond: collapsible tools and gestures for computational design Beyond: collapsible tools and gestures for computational design The MIT Faculty has made this article openly available. Please share how this access benefits you. Your story matters. Citation As Published

More information

! Computation embedded in the physical spaces around us. ! Ambient intelligence. ! Input in the real world. ! Output in the real world also

! Computation embedded in the physical spaces around us. ! Ambient intelligence. ! Input in the real world. ! Output in the real world also Ubicomp? Ubicomp and Physical Interaction! Computation embedded in the physical spaces around us! Ambient intelligence! Take advantage of naturally-occurring actions and activities to support people! Input

More information

Room With A View (RWAV): A Metaphor For Interactive Computing

Room With A View (RWAV): A Metaphor For Interactive Computing Room With A View (RWAV): A Metaphor For Interactive Computing September 1990 Larry Koved Ted Selker IBM Research T. J. Watson Research Center Yorktown Heights, NY 10598 Abstract The desktop metaphor demonstrates

More information

Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation

Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation Hiroshi Ishiguro Department of Information Science, Kyoto University Sakyo-ku, Kyoto 606-01, Japan E-mail: ishiguro@kuis.kyoto-u.ac.jp

More information

SECTION 2. Computer Applications Technology

SECTION 2. Computer Applications Technology SECTION 2 Computer Applications Technology 2.1 What is Computer Applications Technology? Computer Applications Technology is the study of the integrated components of a computer system (such as hardware,

More information

Building a bimanual gesture based 3D user interface for Blender

Building a bimanual gesture based 3D user interface for Blender Modeling by Hand Building a bimanual gesture based 3D user interface for Blender Tatu Harviainen Helsinki University of Technology Telecommunications Software and Multimedia Laboratory Content 1. Background

More information

CHAPTER 1. INTRODUCTION 16

CHAPTER 1. INTRODUCTION 16 1 Introduction The author s original intention, a couple of years ago, was to develop a kind of an intuitive, dataglove-based interface for Computer-Aided Design (CAD) applications. The idea was to interact

More information

Beyond Actuated Tangibles: Introducing Robots to Interactive Tabletops

Beyond Actuated Tangibles: Introducing Robots to Interactive Tabletops Beyond Actuated Tangibles: Introducing Robots to Interactive Tabletops Sowmya Somanath Department of Computer Science, University of Calgary, Canada. ssomanat@ucalgary.ca Ehud Sharlin Department of Computer

More information

Chapter 1 Virtual World Fundamentals

Chapter 1 Virtual World Fundamentals Chapter 1 Virtual World Fundamentals 1.0 What Is A Virtual World? {Definition} Virtual: to exist in effect, though not in actual fact. You are probably familiar with arcade games such as pinball and target

More information

CHAPTER 1 DESIGN AND GRAPHIC COMMUNICATION

CHAPTER 1 DESIGN AND GRAPHIC COMMUNICATION CHAPTER 1 DESIGN AND GRAPHIC COMMUNICATION Introduction OVERVIEW A new machine structure or system must exist in the mind of the engineer or designer before it can become a reality. The design process

More information

Tangible interaction : A new approach to customer participatory design

Tangible interaction : A new approach to customer participatory design Tangible interaction : A new approach to customer participatory design Focused on development of the Interactive Design Tool Jae-Hyung Byun*, Myung-Suk Kim** * Division of Design, Dong-A University, 1

More information

STRANDS AND STANDARDS

STRANDS AND STANDARDS STRANDS AND STANDARDS Digital Literacy Course Description This course is a foundation to computer literacy. Students will have opportunities to use technology and develop skills that encourage creativity,

More information

NAVIGATIONAL CONTROL EFFECT ON REPRESENTING VIRTUAL ENVIRONMENTS

NAVIGATIONAL CONTROL EFFECT ON REPRESENTING VIRTUAL ENVIRONMENTS NAVIGATIONAL CONTROL EFFECT ON REPRESENTING VIRTUAL ENVIRONMENTS Xianjun Sam Zheng, George W. McConkie, and Benjamin Schaeffer Beckman Institute, University of Illinois at Urbana Champaign This present

More information

Audiopad: A Tag-based Interface for Musical Performance

Audiopad: A Tag-based Interface for Musical Performance Published in the Proceedings of NIME 2002, May 24-26, 2002. 2002 ACM Audiopad: A Tag-based Interface for Musical Performance James Patten Tangible Media Group MIT Media Lab Cambridge, Massachusetts jpatten@media.mit.edu

More information

Ortelia Set Designer User Manual

Ortelia Set Designer User Manual Ortelia Set Designer User Manual http://ortelia.com 1 Table of Contents Introducing Ortelia Set Designer...3 System Requirements...4 1. Operating system:... 4 2. Hardware:... 4 Minimum Graphics card specification...4

More information

New Metaphors in Tangible Desktops

New Metaphors in Tangible Desktops New Metaphors in Tangible Desktops A brief approach Carles Fernàndez Julià Universitat Pompeu Fabra Passeig de Circumval lació, 8 08003 Barcelona chaosct@gmail.com Daniel Gallardo Grassot Universitat Pompeu

More information

User Experience of Physical-Digital Object Systems: Implications for Representation and Infrastructure

User Experience of Physical-Digital Object Systems: Implications for Representation and Infrastructure User Experience of Physical-Digital Object Systems: Implications for Representation and Infrastructure Les Nelson, Elizabeth F. Churchill PARC 3333 Coyote Hill Rd. Palo Alto, CA 94304 USA {Les.Nelson,Elizabeth.Churchill}@parc.com

More information

Pinch-the-Sky Dome: Freehand Multi-Point Interactions with Immersive Omni-Directional Data

Pinch-the-Sky Dome: Freehand Multi-Point Interactions with Immersive Omni-Directional Data Pinch-the-Sky Dome: Freehand Multi-Point Interactions with Immersive Omni-Directional Data Hrvoje Benko Microsoft Research One Microsoft Way Redmond, WA 98052 USA benko@microsoft.com Andrew D. Wilson Microsoft

More information

EnSight in Virtual and Mixed Reality Environments

EnSight in Virtual and Mixed Reality Environments CEI 2015 User Group Meeting EnSight in Virtual and Mixed Reality Environments VR Hardware that works with EnSight Canon MR Oculus Rift Cave Power Wall Canon MR MR means Mixed Reality User looks through

More information

Marco Cavallo. Merging Worlds: A Location-based Approach to Mixed Reality. Marco Cavallo Master Thesis Presentation POLITECNICO DI MILANO

Marco Cavallo. Merging Worlds: A Location-based Approach to Mixed Reality. Marco Cavallo Master Thesis Presentation POLITECNICO DI MILANO Marco Cavallo Merging Worlds: A Location-based Approach to Mixed Reality Marco Cavallo Master Thesis Presentation POLITECNICO DI MILANO Introduction: A New Realm of Reality 2 http://www.samsung.com/sg/wearables/gear-vr/

More information

ScrollPad: Tangible Scrolling With Mobile Devices

ScrollPad: Tangible Scrolling With Mobile Devices ScrollPad: Tangible Scrolling With Mobile Devices Daniel Fällman a, Andreas Lund b, Mikael Wiberg b a Interactive Institute, Tools for Creativity Studio, Tvistev. 47, SE-90719, Umeå, Sweden b Interaction

More information

UNIT-III LIFE-CYCLE PHASES

UNIT-III LIFE-CYCLE PHASES INTRODUCTION: UNIT-III LIFE-CYCLE PHASES - If there is a well defined separation between research and development activities and production activities then the software is said to be in successful development

More information

Advancements in Gesture Recognition Technology

Advancements in Gesture Recognition Technology IOSR Journal of VLSI and Signal Processing (IOSR-JVSP) Volume 4, Issue 4, Ver. I (Jul-Aug. 2014), PP 01-07 e-issn: 2319 4200, p-issn No. : 2319 4197 Advancements in Gesture Recognition Technology 1 Poluka

More information

Collaborative Visualization in Augmented Reality

Collaborative Visualization in Augmented Reality Collaborative Visualization in Augmented Reality S TUDIERSTUBE is an augmented reality system that has several advantages over conventional desktop and other virtual reality environments, including true

More information

Immersive Visualization and Collaboration with LS-PrePost-VR and LS-PrePost-Remote

Immersive Visualization and Collaboration with LS-PrePost-VR and LS-PrePost-Remote 8 th International LS-DYNA Users Conference Visualization Immersive Visualization and Collaboration with LS-PrePost-VR and LS-PrePost-Remote Todd J. Furlong Principal Engineer - Graphics and Visualization

More information

AUGMENTED REALITY FOR COLLABORATIVE EXPLORATION OF UNFAMILIAR ENVIRONMENTS

AUGMENTED REALITY FOR COLLABORATIVE EXPLORATION OF UNFAMILIAR ENVIRONMENTS NSF Lake Tahoe Workshop on Collaborative Virtual Reality and Visualization (CVRV 2003), October 26 28, 2003 AUGMENTED REALITY FOR COLLABORATIVE EXPLORATION OF UNFAMILIAR ENVIRONMENTS B. Bell and S. Feiner

More information

Physical Interaction and Multi-Aspect Representation for Information Intensive Environments

Physical Interaction and Multi-Aspect Representation for Information Intensive Environments Proceedings of the 2000 IEEE International Workshop on Robot and Human Interactive Communication Osaka. Japan - September 27-29 2000 Physical Interaction and Multi-Aspect Representation for Information

More information

Software Development & Education Center NX 8.5 (CAD CAM CAE)

Software Development & Education Center NX 8.5 (CAD CAM CAE) Software Development & Education Center NX 8.5 (CAD CAM CAE) Detailed Curriculum Overview Intended Audience Course Objectives Prerequisites How to Use This Course Class Standards Part File Naming Seed

More information

3D Modelling Is Not For WIMPs Part II: Stylus/Mouse Clicks

3D Modelling Is Not For WIMPs Part II: Stylus/Mouse Clicks 3D Modelling Is Not For WIMPs Part II: Stylus/Mouse Clicks David Gauldie 1, Mark Wright 2, Ann Marie Shillito 3 1,3 Edinburgh College of Art 79 Grassmarket, Edinburgh EH1 2HJ d.gauldie@eca.ac.uk, a.m.shillito@eca.ac.uk

More information

Context Sensitive Interactive Systems Design: A Framework for Representation of contexts

Context Sensitive Interactive Systems Design: A Framework for Representation of contexts Context Sensitive Interactive Systems Design: A Framework for Representation of contexts Keiichi Sato Illinois Institute of Technology 350 N. LaSalle Street Chicago, Illinois 60610 USA sato@id.iit.edu

More information

LOOKING AHEAD: UE4 VR Roadmap. Nick Whiting Technical Director VR / AR

LOOKING AHEAD: UE4 VR Roadmap. Nick Whiting Technical Director VR / AR LOOKING AHEAD: UE4 VR Roadmap Nick Whiting Technical Director VR / AR HEADLINE AND IMAGE LAYOUT RECENT DEVELOPMENTS RECENT DEVELOPMENTS At Epic, we drive our engine development by creating content. We

More information

ThumbsUp: Integrated Command and Pointer Interactions for Mobile Outdoor Augmented Reality Systems

ThumbsUp: Integrated Command and Pointer Interactions for Mobile Outdoor Augmented Reality Systems ThumbsUp: Integrated Command and Pointer Interactions for Mobile Outdoor Augmented Reality Systems Wayne Piekarski and Bruce H. Thomas Wearable Computer Laboratory School of Computer and Information Science

More information

rainbottles: gathering raindrops of data from the cloud

rainbottles: gathering raindrops of data from the cloud rainbottles: gathering raindrops of data from the cloud Jinha Lee MIT Media Laboratory 75 Amherst St. Cambridge, MA 02142 USA jinhalee@media.mit.edu Mason Tang MIT CSAIL 77 Massachusetts Ave. Cambridge,

More information

For More Information on Spectrum Bridge White Space solutions please visit

For More Information on Spectrum Bridge White Space solutions please visit COMMENTS OF SPECTRUM BRIDGE INC. ON CONSULTATION ON A POLICY AND TECHNICAL FRAMEWORK FOR THE USE OF NON-BROADCASTING APPLICATIONS IN THE TELEVISION BROADCASTING BANDS BELOW 698 MHZ Publication Information:

More information

PhantomParasol: a parasol-type display transitioning from ambient to detailed

PhantomParasol: a parasol-type display transitioning from ambient to detailed PhantomParasol: a parasol-type display transitioning from ambient to detailed Koji Tsukada 1 and Toshiyuki Masui 1 National Institute of Advanced Industrial Science and Technology (AIST) Akihabara Daibiru,

More information

Ubiquitous Computing. michael bernstein spring cs376.stanford.edu. Wednesday, April 3, 13

Ubiquitous Computing. michael bernstein spring cs376.stanford.edu. Wednesday, April 3, 13 Ubiquitous Computing michael bernstein spring 2013 cs376.stanford.edu Ubiquitous? Ubiquitous? 3 Ubicomp Vision A new way of thinking about computers in the world, one that takes into account the natural

More information

immersive visualization workflow

immersive visualization workflow 5 essential benefits of a BIM to immersive visualization workflow EBOOK 1 Building Information Modeling (BIM) has transformed the way architects design buildings. Information-rich 3D models allow architects

More information

Autodesk Advance Steel. Drawing Style Manager s guide

Autodesk Advance Steel. Drawing Style Manager s guide Autodesk Advance Steel Drawing Style Manager s guide TABLE OF CONTENTS Chapter 1 Introduction... 5 Details and Detail Views... 6 Drawing Styles... 6 Drawing Style Manager... 8 Accessing the Drawing Style

More information

Assembly Set. capabilities for assembly, design, and evaluation

Assembly Set. capabilities for assembly, design, and evaluation Assembly Set capabilities for assembly, design, and evaluation I-DEAS Master Assembly I-DEAS Master Assembly software allows you to work in a multi-user environment to lay out, design, and manage large

More information

Exploring Passive Ambient Static Electric Field Sensing to Enhance Interaction Modalities Based on Body Motion and Activity

Exploring Passive Ambient Static Electric Field Sensing to Enhance Interaction Modalities Based on Body Motion and Activity Exploring Passive Ambient Static Electric Field Sensing to Enhance Interaction Modalities Based on Body Motion and Activity Adiyan Mujibiya The University of Tokyo adiyan@acm.org http://lab.rekimoto.org/projects/mirage-exploring-interactionmodalities-using-off-body-static-electric-field-sensing/

More information

NOSTOS: A Paper Based Ubiquitous Computing Healthcare Environment to Support Data Capture and Collaboration

NOSTOS: A Paper Based Ubiquitous Computing Healthcare Environment to Support Data Capture and Collaboration NOSTOS: A Paper Based Ubiquitous Computing Healthcare Environment to Support Data Capture and Collaboration Magnus Bång, Anders Larsson, and Henrik Eriksson Department of Computer and Information Science,

More information

The use of gestures in computer aided design

The use of gestures in computer aided design Loughborough University Institutional Repository The use of gestures in computer aided design This item was submitted to Loughborough University's Institutional Repository by the/an author. Citation: CASE,

More information

Multi-User Multi-Touch Games on DiamondTouch with the DTFlash Toolkit

Multi-User Multi-Touch Games on DiamondTouch with the DTFlash Toolkit MITSUBISHI ELECTRIC RESEARCH LABORATORIES http://www.merl.com Multi-User Multi-Touch Games on DiamondTouch with the DTFlash Toolkit Alan Esenther and Kent Wittenburg TR2005-105 September 2005 Abstract

More information

NICE: Combining Constructionism, Narrative, and Collaboration in a Virtual Learning Environment

NICE: Combining Constructionism, Narrative, and Collaboration in a Virtual Learning Environment In Computer Graphics Vol. 31 Num. 3 August 1997, pp. 62-63, ACM SIGGRAPH. NICE: Combining Constructionism, Narrative, and Collaboration in a Virtual Learning Environment Maria Roussos, Andrew E. Johnson,

More information

A Multimodal Locomotion User Interface for Immersive Geospatial Information Systems

A Multimodal Locomotion User Interface for Immersive Geospatial Information Systems F. Steinicke, G. Bruder, H. Frenz 289 A Multimodal Locomotion User Interface for Immersive Geospatial Information Systems Frank Steinicke 1, Gerd Bruder 1, Harald Frenz 2 1 Institute of Computer Science,

More information

Conversational Gestures For Direct Manipulation On The Audio Desktop

Conversational Gestures For Direct Manipulation On The Audio Desktop Conversational Gestures For Direct Manipulation On The Audio Desktop Abstract T. V. Raman Advanced Technology Group Adobe Systems E-mail: raman@adobe.com WWW: http://cs.cornell.edu/home/raman 1 Introduction

More information

EOS 80D (W) Wireless Function Instruction Manual ENGLISH INSTRUCTION MANUAL

EOS 80D (W) Wireless Function Instruction Manual ENGLISH INSTRUCTION MANUAL EOS 80D (W) Wireless Function Instruction Manual ENGLISH INSTRUCTION MANUAL Introduction What You Can Do Using the Wireless Functions This camera s wireless functions let you perform a range of tasks wirelessly,

More information

Re-build-ing Boundaries: The Roles of Boundaries in Mixed Reality Play

Re-build-ing Boundaries: The Roles of Boundaries in Mixed Reality Play Re-build-ing Boundaries: The Roles of Boundaries in Mixed Reality Play Sultan A. Alharthi Play & Interactive Experiences for Learning Lab New Mexico State University Las Cruces, NM 88001, USA salharth@nmsu.edu

More information

X11 in Virtual Environments ARL

X11 in Virtual Environments ARL COMS W4172 Case Study: 3D Windows/Desktops 2 Steven Feiner Department of Computer Science Columbia University New York, NY 10027 www.cs.columbia.edu/graphics/courses/csw4172 February 8, 2018 1 X11 in Virtual

More information

REPRESENTATION, RE-REPRESENTATION AND EMERGENCE IN COLLABORATIVE COMPUTER-AIDED DESIGN

REPRESENTATION, RE-REPRESENTATION AND EMERGENCE IN COLLABORATIVE COMPUTER-AIDED DESIGN REPRESENTATION, RE-REPRESENTATION AND EMERGENCE IN COLLABORATIVE COMPUTER-AIDED DESIGN HAN J. JUN AND JOHN S. GERO Key Centre of Design Computing Department of Architectural and Design Science University

More information

Nokia Technologies in 2016 Technology to move us forward.

Nokia Technologies in 2016 Technology to move us forward. Business overview Nokia Technologies in 2016 Technology to move us forward. Our advanced technology development and licensing business group, Nokia Technologies, was established with two main objectives:

More information

Virtual Environments. Ruth Aylett
