Introduction to Mediated Reality


INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION, 15(2), 205-208
Copyright 2003, Lawrence Erlbaum Associates, Inc.

Steve Mann
Department of Electrical and Computer Engineering, University of Toronto

Woodrow Barfield
School of Law

Requests for reprints should be sent to Steve Mann, University of Toronto, Department of Electrical and Computer Engineering, Room SF 2001, 10 King's College Road, Toronto, ON M5S 3G4, Canada. E-mail: mann@eecg.toronto.edu

Personal Cybernetics and Humanistic Intelligence are new and rapidly growing fields of research in the area of human-computer interaction. These areas of research involve personal wearable imaging devices whose intelligence arises from the presence of a human user in the feedback loop of a computational process, in which the human user and the computational process are inextricably intertwined. Unlike the typical goal of Artificial Intelligence (AI), which is to emulate human intelligence with computers, Humanistic Intelligence (HI) creates a close synergy in which Intelligent Signal Processing is used to harness the processing power of the human brain. HI gives rise to a symbiosis between human and computer in which each uses the other within a closely coupled signal processing feedback loop. The computer performs basic low-level signal processing functions, using data obtained from a first-person perspective (wearable camera, microphones, miniature wearable radar, biosensors, etc.), while the human performs high-level cognitive tasks, the combination providing a computationally mediated reality.

Personal Cybernetics and HI form a basis for augmenting, deliberately diminishing, or otherwise mediating the visual perception of reality. Although the visual modality is the one most often used in mediated reality systems based on current technology, other modalities such as touch, taste, and olfaction may be mediated as well. In the visual domain, a system that can augment, diminish, or otherwise alter the visual perception of reality is called a Reality Mediator (RM). By way of explanation: virtual reality creates a completely computer-generated environment; augmented reality uses an existing, real-life environment and adds computer-generated information (virtual objects) to it; diminished reality filters the environment (i.e., it alters real objects, replaces them with virtual ones, or renders them imperceptible); and mediated reality combines augmented and diminished reality.
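To make the distinction concrete, the sketch below shows one minimal way a Reality Mediator's per-frame processing could be organized: a diminishing step that renders unwanted real-world content imperceptible, followed by an augmenting step that overlays virtual content. This is an illustrative assumption rather than a description of any system discussed in this issue; the function name mediate_frame, the flat-fill diminishing step, and the mask inputs are all hypothetical.

```python
import numpy as np

def mediate_frame(frame, diminish_mask, overlay, overlay_mask):
    """Hypothetical single Reality Mediator step: diminish, then augment.

    frame         -- H x W x 3 image from the wearer's first-person camera
    diminish_mask -- H x W boolean mask of real content to filter out
                     (e.g., an advertising billboard)
    overlay       -- H x W x 3 computer-generated content to add
    overlay_mask  -- H x W boolean mask of where the overlay is drawn
    """
    mediated = frame.copy()

    # Diminished reality: render the unwanted real objects imperceptible.
    # A flat gray fill stands in here for inpainting or object replacement.
    mediated[diminish_mask] = 128

    # Augmented reality: add virtual objects on top of the (possibly
    # diminished) view of the real scene.
    mediated[overlay_mask] = overlay[overlay_mask]

    return mediated

# Pure virtual reality would ignore `frame` entirely and synthesize the whole
# image; pure augmented reality would skip the diminishing step. Mediated
# reality, as defined above, keeps both.
```

In an actual wearable system such a loop would run at video rate between the camera and the display (or EyeTap), but the ordering of the two steps, diminish then augment, is what distinguishes mediated reality from augmented reality alone.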

Reality Mediators are useful, for example, in applications involving the visually challenged. In such applications, RMs simplify the visual information presented to the wearer. Mediated reality may also serve as a framework for filtering out real-world spam (advertising billboards, etc.) and for allowing individuals to communicate with one another by altering each other's perception of reality. In mediated reality, a wearer of the apparatus (e.g., glasses, sensors) may, for example, be shopping at a grocery store while a remote spouse or friend views the transmitted video in a stabilized coordinate system and then draws directly on the retina of the wearer of the glasses using a directed (computer-controlled) laser beam, such as that made possible with EyeTap (TM) technology connected to a Xybernaut (TM) wearable computer system. In this way, a remote individual could collaborate with the wearer of the apparatus in everyday experiences such as shopping for a new car, sightseeing, and so forth.

In addition to the current direction of research in HI (e.g., Personal Imaging, and the field of Personal Technologies in general), this special issue includes articles in the field of Rehabilitative Medicine. In particular, prosthetic devices that improve the quality of the everyday lives of the visually challenged, whether at work, at play, or just walking down the street, constitute an important part of this new research field.

Mediated reality lies at the intersection of four related fields:

1. Telephony, wireless communications, videoconferencing, and so forth.
2. Photography/videography, electronic newsgathering (ENG), and so forth.
3. Visual science (e.g., optometry, visual aids, and night vision systems).
4. Human-computer interaction (HCI).

The special issue comprises six articles documenting research generally on the following topics. These topics represent some of the current areas of research and interest in the emerging field of mediated reality.

1. Image processing for Personal Imaging systems.
2. Signal processing for Wearable Cybernetics.
3. Wearable visual information processing.
4. Wearable applications of image processing.
5. Video-based personal safety devices for use by ordinary citizens to help them participate in crime reduction.
6. Fusion of wearable video and other sensing modalities.
7. Visual and other modality prostheses.
8. Videographic/photographic memory prostheses.
9. Visualization and data dissemination from personal imaging systems.
10. Innovative vision-based devices and systems.
11. Innovative eyewear.
12. Vision aids for the blind or partially sighted.
13. Night Vision Goggles (NVG) and low-light visual aids.
14. Vision aids for those with visual memory or visual processing disabilities.
15. Innovative wearable video display or processing technologies.
16. Visual pattern recognition systems suitable for use in personal imaging.

17. Computer-supported collaborative living.
18. VideoOrbits image processing and algebraic projective geometry.
19. Collaborative cybernetic photography/videography and shared visual space.
20. New paradigms in photography, videography, photojournalism, and wearable electronic newsgathering.
21. Signal processing of EyeTap video signals and systems.
22. Issues in user-interface design.
23. Empirical studies.

In the first article, "Early Experiences of Visual Memory Prosthesis for Supporting Episodic Memory," Hoisko (this issue) describes a visual memory prosthesis based on a wearable camera system, which has its motivations in mediated reality. Indeed, a very important aspect of mediated reality is temporally mediated reality (e.g., computer-induced flashbacks of previously seen items can serve as a photographic visual memory recall).

In "A Wearable Mobility Aid for Low Vision Using Scene Classification in a Markov Random Field Model Framework," Everingham, Thomas, and Troscianko (this issue) describe a system of great potential use to the visually challenged.

Further, in "Testing Visual Search Performance Using Retinal Light Scanning as a Future Wearable Low Vision Aid," Lin, Seibel, and Furness (this issue) describe the results of a study conducted to determine whether a display and camera system can be used as a low vision aid. In this study, camera-bearing headgear with a head-worn display provides a classic form of reality mediator, which shows great promise for the visually challenged.

Next, in "Mediated Reality Through Glasses or Binoculars? Exploring Use Models of Wearable Computing in the Context of Aircraft Maintenance," Fallman (this issue) explores use models of wearable computing in the context of aircraft maintenance. Findings of an interpretive case study conducted at Scandinavian Airlines Systems, the largest commercial airline in Scandinavia, are presented with respect to the usefulness of mediated reality in a real-world setting.

In "Seeing with the Brain," Bach-y-Rita (this issue) addresses the plasticity of the brain and the process of visual learning. In this work, images are augmented by nonsynaptic and other brain plasticity mechanisms, within the context of the cognitive value of information passing into the brain.

Finally, in "The Internet Chair," Cohen (this issue) describes a new kind of user interface based on an Internet-connected chair. This represents a different kind of mediated reality than one might associate with the classic wearable camera and display apparatus. The Internet Chair is a spatial media interface that combines the closeness of clothing and eyeglasses with the external properties of an intelligent environment. It thus shows an example of a departure from the traditional clothing- and eyeglass-based forms of Reality Mediators.

REFERENCES

Bach-y-Rita, P., Tyler, M. E., & Kaczmarek, K. A. (2003). Seeing with the brain. International Journal of Human-Computer Interaction, 15, 285-295.

Cohen, M. (2003). The Internet Chair. International Journal of Human-Computer Interaction, 15, 297-311.

Everingham, M. R., Thomas, B. T., & Troscianko, T. (2003). A wearable mobility aid for low vision using scene classification in a Markov random field model framework. International Journal of Human-Computer Interaction, 15, 231-244.

Fallman, D. (2003). Mediated reality through glasses or binoculars? Exploring use models of wearable computing in the context of aircraft maintenance. International Journal of Human-Computer Interaction, 15, 265-284.

Hoisko, J. (2003). Early experiences of visual memory prosthesis for supporting episodic memory. International Journal of Human-Computer Interaction, 15, 209-230.

Lin, S.-K., Seibel, E. J., & Furness, T. A., III. (2003). Testing visual search performance using retinal light scanning as a future wearable low vision aid. International Journal of Human-Computer Interaction, 15, 245-263.