Building a bimanual gesture based 3D user interface for Blender

Modeling by Hand: Building a bimanual gesture-based 3D user interface for Blender
Tatu Harviainen, Helsinki University of Technology, Telecommunications Software and Multimedia Laboratory

Content
1. Background
2. Prototypes and testing with Blender
   1. Virtual reality (VR) system setup
   2. Blender to VR
   3. Usability study
   4. Building a bimanual UI
3. Future work and goals

Background
HandsOn research project
- Ongoing from August 2006 to the end of 2008
- Participating research units: Helsinki University of Technology, Tampere University of Technology, Helsinki University of Art and Design
- Funding: Finnish Funding Agency for Technology and Innovation (TEKES) and Finnish companies ranging from visual effects and film post-production to industrial manufacturing

Background
HandsOn research project
- Focus on studying 3D user interfaces in the context of computer-aided design and animation tasks
- Technologies involved: optical tracking; haptic force feedback; stereoscopic displays and VR installations; three-dimensional user interfaces, interaction models and embodied interaction; analysis of design work processes

Background
Research motivation:
- Study how VR technologies can be applied in the context of 3D modeling and animation
- Study alternative interaction models to support creative 3D design
- Create more intuitive and easier-to-learn user interfaces
- Increase overall understanding of 3D user interfaces

Background
This presentation focuses on the prototype and the Blender-related development:
- VR system setup
- First prototype: using an existing 3D modeling application in VR
- Usability study comparing modeling with a traditional 2D desktop user interface to working in an immersive virtual environment
- Bimanual user interface design
- Iterative implementation of the designed UI with Blender

VR System Setup
The laboratory has previously developed a lightweight CAVE-like VR system called Upponurkka (Lokki et al., 2006).

VR System Setup
- A new, more generic rendering framework was needed in order to use existing software
- Chromium (Humphreys et al., 2002) captures the OpenGL stream by replacing the OpenGL driver on the PC running the application
[Diagram: Application → OpenGL → Chromium (replacing the OpenGL driver) → multiple render servers]

VR System Setup
New modules were implemented for Chromium:
- Viewpoint modification for each rendering process according to head tracking
- Keystone correction of the projected images
- An unmodified OpenGL application can thus be rendered on the VR system (Blender required constant redrawing of the 3D view)
Blender was interfaced with the optical tracking system:
- 3D pointer device for input
- An additional thread for receiving tracker data, bypassing the blocking event-queue wait
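The per-wall viewpoint modification can be sketched as follows. This is an illustrative reconstruction, not the actual Chromium module: assuming each projection wall is a fixed rectangle in tracker space and the tracker reports the viewer's eye position, the standard off-axis ("generalized perspective") frustum bounds for that wall are:

```python
import math

def _sub(a, b): return tuple(x - y for x, y in zip(a, b))
def _dot(a, b): return sum(x * y for x, y in zip(a, b))
def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])
def _normalize(v):
    n = math.sqrt(_dot(v, v))
    return tuple(x / n for x in v)

def off_axis_frustum(pa, pb, pc, pe, near):
    """Frustum bounds (l, r, b, t) at the near plane for a tracked eye
    position pe and a fixed screen given by its lower-left (pa),
    lower-right (pb) and upper-left (pc) corners."""
    vr = _normalize(_sub(pb, pa))    # screen right axis
    vu = _normalize(_sub(pc, pa))    # screen up axis
    vn = _normalize(_cross(vr, vu))  # screen normal, toward the viewer
    va, vb, vc = _sub(pa, pe), _sub(pb, pe), _sub(pc, pe)
    d = -_dot(va, vn)                # perpendicular eye-to-screen distance
    # Scale the corner offsets onto the near plane:
    l = _dot(vr, va) * near / d
    r = _dot(vr, vb) * near / d
    b = _dot(vu, va) * near / d
    t = _dot(vu, vc) * near / d
    return l, r, b, t
```

With the eye centered in front of the wall this reduces to an ordinary symmetric frustum; as the tracked head moves sideways, the bounds shift so that the projected image stays perspectively correct for the viewer's actual position.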

System Setup: Blender to VR
[Diagram: The Application PC (WinXP / Linux) runs Blender 3D, whose OpenGL output is captured by Chromium (OpenGL32.dll); head and 3D cursor locations arrive over UDP from the two-camera optical tracker. Chromium streams OpenGL plus per-wall view / projection matrices to Render PC 1 and Render PC 2 (both Linux), each running a Chromium render server that drives two projectors.]
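The "head & 3D cursor location (UDP)" link and the extra receiving thread can be sketched like this. The wire format here is an assumption for illustration only (six little-endian floats: head x, y, z followed by cursor x, y, z); the real tracker protocol is not given in the slides.

```python
import socket
import struct
import threading

# Hypothetical wire format: six little-endian floats per datagram,
# head position (x, y, z) followed by 3D cursor position (x, y, z).
PACKET = struct.Struct("<6f")

def parse_tracker_packet(data):
    """Decode one tracker datagram into (head_xyz, cursor_xyz)."""
    values = PACKET.unpack(data)
    return values[:3], values[3:]

class TrackerReceiver(threading.Thread):
    """Background thread that keeps only the latest tracker sample,
    so the application thread never blocks on its event queue waiting
    for tracker input (mirroring the extra thread described above)."""

    def __init__(self, port=5555):
        super().__init__(daemon=True)
        self._sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self._sock.bind(("127.0.0.1", port))
        self._lock = threading.Lock()
        self.head = (0.0, 0.0, 0.0)
        self.cursor = (0.0, 0.0, 0.0)

    def run(self):
        while True:
            data, _ = self._sock.recvfrom(PACKET.size)
            head, cursor = parse_tracker_packet(data)
            with self._lock:
                self.head, self.cursor = head, cursor

    def latest(self):
        """Most recent head and cursor positions, safe to call from the UI loop."""
        with self._lock:
            return self.head, self.cursor
```

Polling `latest()` from the application's own loop decouples rendering from tracker delivery, which is the point of bypassing the blocking event-queue wait.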

Usability Study
- Compare the use of Blender with its original 2D UI to its use in VR with a 3D UI
- Automatic instrumentation to collect user performance statistics
- Additional visualization of vertex locations to guide test users
- Results published at the INTUITION 2007 conference: T. Harviainen, L. Svan, T. Takala, "Usability Testing of Virtual Reality Aided Design: Framework for a Prototype Development and a Test Scenario", Proceedings of the 4th INTUITION International Conference on Virtual Reality and Virtual Environments, 4-5 October 2007, Athens, Greece, ISBN 978-960-254-665-9

Test Scenario
- Users were asked to shape a 3D object to match an object shown for reference
- Only direct vertex manipulation was used; shaping was done by translating one vertex at a time
- View control: in the 3D UI automatically by head tracking; in the 2D UI with the normal viewport controls
- Three test cases
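Since shaping was done purely by translating one vertex at a time, the core edit operation is very small; a generic sketch (illustrative, not Blender's internal mesh code):

```python
def translate_vertex(vertices, index, delta):
    """Return a copy of the vertex list with one vertex moved by delta --
    the one-vertex-at-a-time edit used in the test tasks."""
    moved = list(vertices)
    x, y, z = moved[index]
    dx, dy, dz = delta
    moved[index] = (x + dx, y + dy, z + dz)
    return moved
```

In the 3D UI, `delta` would come from the tracked 3D pointer's displacement; in the 2D UI, from mouse drags projected into the viewport plane.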

Test Scenario

Data Collection
- Quantitative data: all user actions collected with timestamps; times to complete each task
- Qualitative data: structured questionnaire; interview
- Small-scale pilot test: 6 test users with varying CAD experience

Preliminary Results
Time to complete the task (s):

Task      | User 1  | User 2   | User 3  | User 4  | User 5  | User 6
Task 2 2D | 186.656 | 188.766  | 222.156 |  89.969 |  88.250 | 264.047
Task 3 2D | 220.172 | 368.406  | 274.438 |  38.860 | 137.125 | 362.500
Task 2 3D | 475.547 | 2447.828 | 664.829 | 228.625 | 609.782 | 111.828
Task 3 3D | 373.922 | 677.281  | 415.016 | 172.875 | 248.407 | 146.719

Preliminary Results
Qualitative results:
- Overall, users regarded the 3D UI as more intuitive and easier to learn
- Users still preferred to work with the 2D UI
No conclusive quantitative results can be given at this point. The pilot test pointed out needs for improvement in our test setup:
- Technical issues concerning tracking and stereoscopic rendering
- A longitudinal study to minimize the effect of previous experience with the 2D UI

Building a Bimanual 3D UI
- We are currently implementing a bimanual gesture-based 3D UI for modeling and character animation using Blender
- The first version of the 3D UI will hopefully soon be ready for preliminary testing

3D UI Design
- Focus on improving the creative use of CAD
- Most suitable for the initial drafting phase, when rough and arbitrary overall shapes are designed, and for designing organic free-form shapes in general
- First tools target polygonal modeling
- Aim to provide tools with the same expressive power as the polygonal modeling tools commonly found in current 3D modeling software

3D UI Design
- A small set of hand gestures and hand movements used as input
- Reduce the use of menus and keyboard shortcuts
- Context-sensitive gesture commands and direct manipulation

3D UI Implementation
Hand tracking interfaced with Blender:
- 6-degrees-of-freedom input for both hands
- Bend angles for all fingers
- Generic UDP interface for receiving data
New state management for handling bimanual operations:
- Additional hand-data polling and processing loop
- Concurrent modeling commands, edit modes and selections
- Several individual ongoing operations can be executed in parallel
- The original 2D UI can be used in conjunction with the 3D UI
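The bimanual state management can be sketched as per-hand bookkeeping in which each hand owns its current operation, so two operations can run in parallel. All names here are illustrative assumptions, not Blender internals:

```python
class HandState:
    """Tracked state for one hand (a full 6-DOF pose would add rotation)."""
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)
        self.finger_bend = [0.0] * 5   # bend angle per finger
        self.operation = None          # e.g. "grab", "scale", or None

class BimanualState:
    """Tracks an independent ongoing operation per hand, so e.g. the left
    hand can hold a selection while the right hand translates vertices."""

    def __init__(self):
        self.hands = {"left": HandState(), "right": HandState()}

    def start(self, hand, operation):
        if self.hands[hand].operation is not None:
            raise RuntimeError(f"{hand} hand is already busy")
        self.hands[hand].operation = operation

    def finish(self, hand):
        op = self.hands[hand].operation
        self.hands[hand].operation = None
        return op

    def active(self):
        """Operations currently running, keyed by hand."""
        return {h: s.operation for h, s in self.hands.items()
                if s.operation is not None}
```

A hand-data polling loop would update each `HandState` from the UDP interface and drive gesture recognition, while the 2D UI's own event handling continues untouched alongside it.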

3D UI Implementation

Future Work
- Usability studies
- Iterative development (design-implementation-testing) of the 3D UI
- Additional features and interaction models: subdivision surfaces; sculpting
- Testing with various VR system setups
- Provide access to the 3D UI technology to a wider audience; we hope to spark interest outside the project group as well

Thank You!
Contact: tatu.harviainen@tkk.fi
http://handson.uiah.fi