1 Haptics & VR Syllabus
Week (date) | Lecture | Readings | Assignment due
Week 1 (9.2) | Overview | |
Week 2 (9.9) | 1. Introduction | |
Week 3 (9.16) | 2. Input Devices | summary 1 | Readings proposal
Week 4 (9.23) | 3. Output Devices | summary 2 | Haptic VR project proposal
Week 5 (9.30) | 4. Computing Architecture for VR | summary 3 | Graphics assignment 1
Week 6 (10.7) | 5. Modeling (김기권) | summary 4 | Graphics assignment 2
Week 7 (10.14) | 6. VR Programming (김기권) | summary 5 | Graphics assignment 3
Week 8 (10.21) | 7. Human Factors in VR (이환문) | summary 6 | Graphics assignment 4
Week 9 (10.28) | 8. Traditional VR Applications | summary 7 | Graphics assignment 7
Week 10 (11.4) | 9. Emerging Applications of VR (이환문) | summary 8 | Graphics assignment 8
Week 11 (11.11) | final exam | |
Week 12 (11.18) | Lab 1 | Seminar 1 |
Week 13 (11.25) | Lab 2 | Seminar 2 |
Week 14 (12.2) | Lab 3 | Seminar 3 |
Week 15 (12.9) | Lab 4 | Seminar 4 |
Week 16 (12.16) | Haptic VR project presentation | | Haptic VR project code & document 1
2 Output Devices: Graphics, 3-D Sound, and Haptic Displays 2
3 Output Devices The human senses need specialized interfaces Graphics displays for visual feedback; 3-D audio hardware for localized sound; Haptic interfaces for force and touch feedback; Not interested in smell and taste feedback. 3
4 Output Devices Definition: A graphics display is a computer interface that presents synthetic world images to one or several users interacting with the virtual world. 4
5 Output Devices Graphics Displays: Human stereo viewing; Personal displays; Large volume displays (viewed with active glasses): Workbenches (e.g., Microsoft Surface); Caves; Walls 5
6 Output Devices Human Visual System Vision is the dominant human sensorial channel; Depth perception in mono images is based on occlusion (one object blocks another from view), and on shadows, textures, and motion parallax (closer objects appear to move more than distant ones) 6
7 Human Visual System (continued) Depth perception in stereo is based on stereopsis, when the brain registers and fuses two images; Image parallax means that the two eyes register different images (horizontal shift); The amount of shift depends on the inter-pupillary distance (IPD), which varies from person to person; Works in the near field (to a few meters from the eye) (pupillary: relating to the pupil of the eye) 7
8 Output Devices Field of view: one eye: 150 degrees horizontally, 120 degrees vertically; both eyes: 180 degrees horizontally, 120 degrees vertically; Binocular overlap: 120 degrees horizontally; Motion parallax is important in monoscopic depth perception (parallax: the difference between the views of the two eyes) 8
9 Output Devices (same principle used in new 3D HDTVs) Left eye image Right eye image 9
10 Output Devices Implications for Stereo Viewing Devices Need to present two images of the same environment; The two images can be presented at the same time on two displays (HMD); The two images can also be presented time-sequenced on one display (active glasses, i.e., shutter glasses); The two images can also be presented spatially-sequenced on one display (autostereoscopic displays). 10
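Whichever presentation scheme is used, the two views differ by a horizontal shift (image parallax) that depends on scene depth. A minimal sketch of that geometry (Python); the 63 mm IPD is the mean value quoted later in these slides, while the 600 mm screen distance and the sign convention are illustrative assumptions:

```python
def stereo_parallax(point_depth, ipd=63.0, screen=600.0):
    """Horizontal image parallax (mm) on the screen plane for a point
    straight ahead at `point_depth` mm from the viewer.

    Each eye sits at x = +/- ipd/2; projecting the point onto the screen
    plane at distance `screen` and subtracting the two projections gives
    parallax = ipd * (screen/point_depth - 1).
    """
    return ipd * (screen / point_depth - 1.0)

# A point on the screen plane produces zero parallax; in this sign
# convention, points behind the screen give negative (uncrossed)
# parallax and points in front give positive (crossed) parallax.
print(stereo_parallax(600.0))    # 0.0   (on the screen plane)
print(stereo_parallax(1200.0))   # -31.5 (behind the screen)
```

The stereo rendering pipeline produces exactly these shifted projections by rendering the scene from two camera positions offset by the IPD.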
11 3D display (from Wikipedia) Types of 3D displays:
Stereoscopic (stereoscopy): Based on the principles of stereopsis, described by Sir Charles Wheatstone in the 1830s, stereoscopic technology uses a separate device for each person viewing the scene to provide a different image to the person's left and right eyes. Examples of this technology include anaglyph images and polarized glasses. Stereoscopic technologies generally involve special spectacles.
Autostereoscopic (autostereoscopy): An evolutionary development of stereoscopy, autostereoscopic display technologies use optical trickery at the display, rather than worn by the user, to ensure that each eye sees the appropriate image. They generally allow the user to move their head a certain amount without destroying the illusion of depth. Automultiscopic displays include view-dependent pixels with different intensities and colors based on the viewing angle; this means that a number of different views of the same scene can be seen by moving horizontally around the display. In most automultiscopic displays the change of view is accompanied by the breakdown of the illusion of depth, but some displays exist which can maintain the illusion as the view changes [2]. This category of display technology includes autostereograms.
Computer-generated holography: The hologram is a familiar artifact of the late 20th century, and research into holographic displays has produced devices which are able to create a light field identical to that which would emanate from the original scene, with both horizontal and vertical parallax across a large range of viewing angles. The effect is similar to looking through a window at the scene being reproduced; this may make CGH the most convincing of the 3D display technologies, but as yet the large amounts of calculation required to generate a detailed hologram largely prevent its application outside of the laboratory.
Some companies do produce holographic imaging equipment commercially. [3]
Volumetric displays: In addition there are volumetric displays, where some physical mechanism is used to display points of light within a volume. Such displays use voxels instead of pixels. Volumetric displays include multiplanar displays, which have multiple display planes stacked up, and rotating panel displays, where a rotating panel sweeps out a volume. Other technologies have been developed to project light dots in the air above a device. An infrared laser is focused on the destination in space, generating a small bubble of plasma which emits visible light. As of August 2008, the experiments only allow a rate of 100 dots per second. One of the issues that arises with this type of 3D display system is the use of technologies that could be harmful to human eyes. 11
12 Output Devices Definition: Personal Displays A graphics display that outputs a virtual scene destined to be viewed by a single user. Such an image may be monoscopic or stereoscopic, monocular (for a single eye) or binocular (displayed to both eyes). 12
13 Output Devices Personal Displays: Head Mounted Displays; 3-D Binoculars (hand supported); Booms (floor supported); Virtual windows (floor supported); Auto-stereoscopic displays (desk supported). 13
14 Simplified HMD optics model The lower the HMD resolution and the higher the FOV, the greater the number of arc-minutes of eye view corresponding to each pixel (granularity of HMD, expressed in arc-minutes/pixel) 14
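This granularity follows directly from the FOV and pixel count along one axis. A small sketch (Python); the Eye Trek numbers come from a later slide, and the formula ignores optical distortion:

```python
def hmd_granularity(fov_degrees, pixels):
    """Arc-minutes of visual field per pixel along one axis (1 degree = 60 arcmin)."""
    return fov_degrees * 60.0 / pixels

# Olympus Eye Trek FMD-200: 30-degree horizontal FOV over 267 pixels.
print(round(hmd_granularity(30, 267), 2))   # ~6.74 arcmin/pixel
# Normal visual acuity resolves about 1 arcmin, so lower values are better.
```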
15 HMD integration in a VR system Consumer HMD Professional HMD 15
16 AMLCD (active matrix liquid crystal display), Resolution: 267x225 FOV: 30x23 degrees Equivalent to 62 in at 2 m Weight: 100 grams Can be worn over glasses Olympus Eye Trek Face Mounted Display (FMD 200) 16
17 Sony / Olympus Eye Trek Head Mounted Display Optics Uses a free-form lens to compensate for aberrations; an eccentric optical system to reduce size (eliminates the 45-degree mirror) 17
18 Olympus Eye Trek Face Mounted Display Optics 18
19 Daeyang cy-visor Face Mounted Display LCOS display, Resolution: 800x600 FOV: 60x43 degrees Weight: 160 grams Can be worn over glasses Liquid Crystal on Silicon display (LCOS) 19
20 Daeyang cy-visor Face Mounted Display It is reflective, so it needs external lighting 20
21 Organic LEDs (OLED) In an active-matrix OLED display, each individual pixel can be addressed independently via the associated TFTs and capacitors in the electronic backplane. That is, each pixel element can be selected to stay on during the entire frame time, or duration of the video. Since OLED is an emissive device, the display aperture factor is not critical, unlike LCD displays where light must pass through an aperture. Therefore, there are no intrinsic limitations to the pixel count, resolution, or size of an active-matrix OLED display, leaving the possibilities for commercial use open to our imagination. Also, because of the TFTs in the active-matrix design, a defective pixel produces only a dark effect, which is considered to be much less objectionable than a bright point defect, like those found in LCDs. 21
22 Organic LEDs (OLED)
Robust Design: OLEDs are tough enough to use in portable devices such as cellular phones, digital video cameras, DVD players, car audio equipment and PDAs.
Viewing Angles: Viewable at up to 160 degrees, OLED screens provide a clear and distinct image, even in bright light.
High Resolution: For high-information applications including video and graphics, active-matrix OLED provides the solution. Each pixel can be turned on or off independently to create multiple colors in a fluid and smooth-edged display.
Electronic Paper: OLEDs are paper-thin. Due to the exclusion of certain hardware that normal LCDs require, OLEDs are as thin as a dime.
Production Advantages: Up to 20% to 50% cheaper than LCD processes. Plastics will make OLEDs tougher and more rugged. The future quite possibly could consist of OLEDs being produced like newspapers, rather than like computer chips.
Video Capabilities: They can handle streaming video, which could revolutionize the PDA and cellular phone market.
Hardware Content: Lighter and faster than LCDs. Can be produced out of plastic and is bendable. Also, OLEDs do not need lamps, polarizers, or diffusers.
Power Usage: Takes less power to run (2 to 10 volts). 22
23 5DT Head Mounted Display 800x600 pixels 40 o diagonal view Organic LED Frame sequential stereo 600 grams $4k 23
24 Samsung Emagin z800 OLED HMD Weight 8 oz PC connection - USB, RGB input SVGA resolution (800x600 pixels) stereo Tracking degrees pan 60 degrees pitch $1200 USD OLED (Organic light-emitting diode) 24
25 Sensics pisight panoramic OLED HMDs 25
26 Sensics pisight panoramic HMDs Uses Organic LED A series of micro-displays with special optics to generate a panoramic view Weight 2 lbs (1 Kg) SVGA input resolution (2400x1729 pixels) Field of view 179 horizontal by 58 vertical Binocular overlap 82 Cost? USD sensics.com 26
27 Professional HMDs Kaiser ProView AMLCD display, Resolution: 1024x768 FOV: 28x21 degrees Weight: 992 grams Even with their improved resolution, consumer FMDs cannot match professional-grade LCD-based HMDs 27
28 Professional HMDs N-Vision Datavisor CRT display, Resolution: 1280x FOV: 78x39 degrees Weight: 1587 grams 28
29 LCOS (Liquid Crystal on Silicon display) Virtual Binoculars ( 가상쌍안경 ) 29
30 Virtual Binoculars 30
31 Floor-supported displays Boom3C (courtesy of Fakespace Labs.) 31
32 21-in LCD display, Resolution: SXGA (1600x1200), Counter-balanced; No dead space, but high latencies due to a third-party tracker Virtual Window 3-D Display (courtesy of Virtual Research Co.) 32
33 Output Devices Auto-stereoscopic displays do not require the use of special glasses; Passive auto-stereoscopic displays do not track the user's head and thus restrict the user's position; Active auto-stereoscopic displays track the head motion and give more freedom of motion. 33
34 Passive Auto-stereoscopic 3-D Display (Dimension Technologies Co.) The relation between the backlighting distance d and the distance to the user D determines a stereo viewing cone 34
35 18.1 LCD display, Resolution: 1280x1024 (mono) 640 x 1024 (stereo) Weight: kg Passive Auto-stereoscopic 3-D Display (courtesy of Dimension Technologies Co.) 35
36 SynthaGram technology (StereoGraphics Co.) The lenticular screen is slanted; The flat panel display presents 9 images obtained from 9 virtual cameras; The final image is produced in software by sub-pixel sampling and re-arranging the final image pixels in intimate juxtaposition with the optical elements of the lenticular screen; The resultant image has higher resolution in the horizontal direction and is sharper 36
37 40 LCD display, Resolution: 1280x768 pixels 70 o horizontal viewing (7 to 15 feet) Weight: 33.2 kg SynthaGram 404 (courtesy of StereoGraphics Co. - $12,000) 37
38 20 LCD display, Resolution: 1600x1200 (mono) 100 o horizontal viewing (1.5 to 6 feet) Weight: 8.4 kg SynthaGram 204 (courtesy of StereoGraphics Co. - $3,000) 38
39 18 LCD display, Resolution: 1280x1024 (mono) 640 x 1024 (stereo) Weight: 17 kg Active auto-stereoscopic 3-D Display (courtesy of Dresden 3D Co.) mechanical adaptation 39
40 TFT (thin film transistor) Active tracking accommodates a 25-degree change in view direction; electronic adaptation: AAC (auto adaptation coder) 40
41 Multi-user Auto-stereoscopic display How it works: The display redirects the appropriate frames to the right and left eye so that each eye can only see the relevant frame. The tracker locates each eye and sends the information to the control box. The control box then tells the LCD screen which pixels to display. Through the optics system in the screen, the image is directed through the thin film transistor (TFT) directly to the appropriate eye. A split second later the same is done for the other eye, creating a 3D image. 41
42 Multi-user Auto-stereoscopic display Multiple users can be tracked simultaneously and more pixels can be opened up at any given time, allowing light beams to be directed simultaneously to more than one eye and more than one 3D user. Position finders already track the pupils of multiple viewers with very small delay. Good resolution, but still shows some flicker; OLEDs becoming mainstream can help eliminate flicker. What needs to be done? Better displays (100-120 Hz); Complete the multi-user concept 42
43 Multiple users auto-stereoscopic prototype 19 display from NextGen Technology 43
44 Sharp autostereoscopic laptop Pentium 4, 15-in diagonal display, 1024x768 resolution, 2D and 3D modes, uses a parallax barrier. 44
45 Autostereoscopic cell phones! Ocuity (UK) and NEC make autostereoscopic cell phones with 2.5-in displays. InTouch mobile handset (TTPCom): 2.1-in transflective 2D/3D TFT-LCD, 132xRGBx176 pixel display; Automatic control of the 2D-to-3D switching function; Runs a TTPCom WGE 3D stereo game demonstration 45
46 Holographic displays The image source is based on standard flat panel technology, whose image is seen upon a nine-layer optical glass panel. Objects appear to float in space. For the maximum 3D effect, the background seen through the display should be several feet behind the display and dark in color. 46
47 Holographic displays EON TouchLight Bare-hand 3D interaction virtual reality display system; The VR scene can be zoomed, panned and rotated with both hands; Uses image processing techniques to combine the output of two video cameras placed behind a semi-transparent plane in front of the user; Incorporates IR cameras and an image processing board 47
48 48
49 Output Devices Large Volume Displays Allow several co-located users to view a monoscopic or stereoscopic view of the virtual world; Can be classified as monitor-based large volume displays or projector-based large volume displays. Allow more freedom of motion vs. personal displays. 49
50 Output Devices Monitor-based Large Volume Displays Use active or passive glasses; Several users can look at one monitor; Can have a single monitor, or multiple side-by-side monitors; If side-by-side, image continuity becomes an issue. 50
51 Untracked and wireless Tracked and wireless Refreshing screen at double the normal scan rate or 120~140 scans/sec IR controller directs orthochromatic liquid crystal shutters Active glasses 51
52 Output Devices Active glasses vs. FMDs Some advantages: no cables if head position is not tracked; light and ergonomic (can be worn over vision glasses); work well with large volume displays; allow full screen resolution (1280x1024). Some disadvantages: lose 2/3 of image light intensity through LCD filtering; require a special stereo-ready CRT that has twice the hardware refresh rate (120 Hz or more); require direct line of sight to the IR controller; different viewing metaphor (through the window). 52
53 Active glasses Wireless (old model); Wireless (new model); Wired to the synchronizing jack of the graphics card; I-O Display Systems Inc., $99 vs. $1000 for StereoGraphics wireless glasses 53
54 Wired/wireless glasses need a stereo enabler when connected to a VGA card without a 3-pin mini-DIN output jack 54
55 Passive glasses vs. Active glasses The distance from which you intend to view the pictures requires a certain separation between the cameras. This separation is called the stereo base (or stereo baseline) and results from the ratio of the distance to the image to the distance between your eyes. The mean interpupillary distance (IPD) is 63 mm (about 2.5 inches). (ex) image on a computer monitor viewed from a distance of 1000 mm: view ratio 1000/63
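The view-ratio example works out numerically as follows (Python); interpreting "view ratio" as viewing distance divided by eye separation is taken directly from the slide's wording:

```python
MEAN_IPD_MM = 63.0  # mean interpupillary distance quoted on the slide

def view_ratio(viewing_distance_mm, ipd_mm=MEAN_IPD_MM):
    """Ratio of the distance to the image to the separation of the eyes."""
    return viewing_distance_mm / ipd_mm

# Image on a computer monitor viewed from 1000 mm:
print(round(view_ratio(1000.0), 1))   # ~15.9
```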
56 Passive glasses vs. Active glasses 56
57 Passive glasses vs. Active glasses 57
58 Passive glasses vs. active glasses 58
59 Through the window metaphor It is better to exaggerate the image response to the user's head motion. The head distance used for projection is changed to an exaggerated value k such that k = r (u - U) + U, where: r is the responsiveness factor (optimally 1.25); u is the current head distance from the screen; U is the default distance (say 30 cm). Unfortunately, tracker jitter is amplified as well 59
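The exaggeration is a one-line mapping; a sketch (Python, with the slide's values r = 1.25 and U = 30 cm) also makes the jitter-amplification caveat concrete:

```python
def exaggerated_distance(u, r=1.25, U=30.0):
    """Exaggerated head distance k = r*(u - U) + U (cm).

    r: responsiveness factor (optimally 1.25 per the slide)
    u: current head distance from the screen
    U: default head distance (30 cm)
    """
    return r * (u - U) + U

print(exaggerated_distance(30.0))  # 30.0: no change at the default distance
print(exaggerated_distance(40.0))  # 42.5: a 10 cm motion is rendered as 12.5 cm

# Downside noted on the slide: additive tracker jitter e is amplified by
# the same factor, since k(u + e) - k(u) = r * e.
```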
60 Active glasses system 60
61 Tiled monitor-based display VC 3.1 on book CD Resolution is 3840 x 1024 and dimensions are 1.11 x 0.29 m 61
62 Non-synchronized tiled image discontinuity Synchronized tiled image 62
63 Output Devices Projector-based Large-Volume Displays Old technology is CRT-based (analog): three projector tubes (R, G, B); Requires a special fast green coating to avoid fogging due to fast switching (at 120 Hz); Suffers from low luminosity problems 63
64 Output Devices Projector-based Large-Volume Displays Technology is transitioning from CRT-based (analog) to Digital Micro-mirror Device (DMD, digital) projectors; Workbench-type displays (Fakespace Responsive Workbench, Barco Baron, V-desk, etc.); Cave-type displays (CAVE, RAVE); Wall-type displays; Domes 64
65 Output Devices Digital Micro-mirror Device Display Light intensities are much larger than for CRT-based projectors: 300 to 1000 or more lumens; Thus ambient light does not hinder image quality 65
66 Tilted surface Viewing Cone Reflector mirror Floor CRT projector (not shown) The old Fakespace ImmersaDesk workbench 66
67 IR Controllers CRT Projector Mirrors Tilting mechanism Baron workbench (courtesy of BARCO Co.) 67
68 Baron Workbench-type display geometries V-desk 68
69 CRT Projector Screen Mirror CAVE 3-D large volume display (courtesy of Fakespace Co.) 69
70 CAVE 3-D large volume display (courtesy of Fakespace Co.) 70
71 New types of stereo displays Such as the BARCO TRACE: Driven by Barco Galaxy stereo DLP projectors; 3000 lumens; 800:1 contrast ratio; WARP geometry distortion for edge matching; 1400 x 1050 pixel resolution; 70-inch diagonal screen; active stereo glasses. Barco Galaxy WARP: Barco integrates real-time geometry distortion into an active-stereo DLP projector 71
72 Microsoft SURFACE one large display (projector); five infrared cameras track the user's finger contact with the surface; PC included in the enclosure 72
73 RAVE (Re-configurable Virtual Environment) Modular construction that allows various viewing configurations, from flat wall, to angled theater, to CAVE; Vertical wall image 2.3 m x 2.4 m; Several CRT projectors (260 lumens, 1280x1024 resolution); Takes 30 minutes or less to reconfigure 73
74 74
75 Output Devices Wall-type displays Accommodate more users; Using a single projector on a large wall means small image resolution; Thus tiled displays place smaller images side-by-side, so they need multiple projectors; Images need to overlap to assure continuity; However, overlap from two projectors means intensity discontinuity (brighter images in the overlap areas); Projectors need to modulate intensities to dim their light for overlap pixels. 75
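The overlap dimming on the last point can be sketched as a blending ramp (Python). The linear ramp and the positions used are illustrative assumptions; real systems often use a smoother (e.g. cosine) ramp plus gamma correction, omitted here:

```python
# Edge-blending weights for two tiled projectors whose images overlap.
# Inside the overlap region each projector's intensity is scaled so the
# two contributions always sum to 1, removing the bright seam.

def blend_weights(x, overlap_start, overlap_end):
    """Weights (left_projector, right_projector) at horizontal position x."""
    if x <= overlap_start:
        return (1.0, 0.0)            # only the left projector covers x
    if x >= overlap_end:
        return (0.0, 1.0)            # only the right projector covers x
    t = (x - overlap_start) / (overlap_end - overlap_start)
    return (1.0 - t, t)              # ramp down / ramp up across the overlap

# At the middle of a [0.4, 0.6] overlap both projectors contribute half.
print(blend_weights(0.5, 0.4, 0.6))
print(sum(blend_weights(0.45, 0.4, 0.6)))  # contributions sum to 1
```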
76 Pano-Wall display Three projectors; Approx. 7 m x 2 m 76
77 PanoWall display 77
78 Output Devices 78
79 Tiled composite image from four projectors 79
80 Tiled composite image from four projectors after adjustment 80
81 Wall and dome-type displays Advantages: Accommodate more users (tens to hundreds); Give users more freedom of motion. Disadvantages: Large cost (up to millions of dollars); Even with multiple projectors, resolution is much lower than for CRTs (because the area is large). Example: the PanoWall has 200,000 pixels/m2 while a monitor has 18,200,000 pixels/m2. To have equal numbers of pixels per unit area requires many more projectors (military) 81
82 82
83 Output Devices 3-D Audio Displays Definition: Sound displays are computer interfaces that provide synthetic sound feedback to the user interacting with the virtual world. The sound can be monaural (both ears hear the same sound) or binaural (each ear hears a different sound). 83
84 Output Devices 3-D Audio Displays 3-D audio should not be confused with stereo sound; Human hearing model; HRTF (head-related transfer function)-based 3-D sound; Convolvotron; 3-D sound cards. 84
85 Stereo vs. 3-D sound. 85
86 Output Devices Human Hearing Model Vertical-Polar coordinate system azimuth, elevation, distance (range); azimuth cues; elevation cues; Effect of pinna (outer ear); HRTFs 86
87 Output Devices Head Related Transfer Function (HRTF) 87
88 Output Devices 88
89 3-D Sound Effect of pinna filtering of sound (elevation and azimuth cues) 89
90 Output Devices NASA, again a pioneer in 3-D sound: Put microphones in dummy heads; Played localized sound and measured the signal; Determined the HRTF; Worked on the first circuitry 90
91 3D sound localization 91
92 The Convolvotron PC 3-D sound boards. 92
93 The Huron workstation ti. 93
94 Output Devices 3-D Audio Displays 94
95 Sweet Spot It is possible to create the illusion of many more phantom speakers surrounding the user and create effective azimuth localization. 95
96 Cross-talk effect Sound from one speaker reaches both ears:

[ Y_left  ]   [ H_l,l  H_l,r ] [ S_left  ]
[ Y_right ] = [ H_r,l  H_r,r ] [ S_right ]

where H_l,l is the HRTF between the left speaker and the left ear, H_l,r is the HRTF between the right speaker and the left ear, Y_left is the sound reaching the left ear, Y_right is the sound reaching the right ear, S_left is the sound coming from the left loudspeaker, and S_right is the sound coming from the right loudspeaker 96
97 Cross-talk effect cancellation Sound from both speakers is adjusted such that:

[ S_left  ]   [ H_l,l  H_l,r ]^-1 [ Y_left  ]
[ S_right ] = [ H_r,l  H_r,r ]    [ Y_right ]

where Y_left and Y_right are known (the output of the convolving process) 97
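At a single frequency the cancellation is just a 2x2 linear solve. A sketch (Python/NumPy); the gain values in H are made-up illustrative numbers, not measured HRTFs:

```python
import numpy as np

# H[ear][speaker]: each entry stands in for an HRTF at one frequency.
H = np.array([[1.0, 0.4],   # left ear:  left-speaker path, cross-talk from right
              [0.4, 1.0]])  # right ear: cross-talk from left, right-speaker path

Y = np.array([1.0, 0.0])    # desired ear signals: sound at the left ear only

# Speaker signals that cancel the cross-talk: S = H^-1 Y
S = np.linalg.solve(H, Y)
print(np.round(S, 4))       # left speaker boosted, right speaker in antiphase

# Verify: playing S through the speakers reproduces Y at the ears.
assert np.allclose(H @ S, Y)
```

In a real system H varies with frequency, so the inversion is done per frequency bin (or as inverse filters), and H must be well-conditioned for the inversion to remain stable.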
98 Commercial 3D Sound Cards What they have to offer: Digital Output Multi-speaker compatibility (7.1 channel format allows for 8 speakers) Positional Audio offers 3D dimensions of sound 98
99 Two Main Audio APIs DirectSound 3D (DS3D): Microsoft's DirectX component for positional audio in games (engine, to software, to sound card signals). Aureal 3D (A3D): two versions; 1.0 was very similar to plain DS3D, while 2.0 could more accurately simulate how sound sources in a complex environment behave. EAX (Environmental Audio eXtensions): an extension of DS3D (by itself, it's just reverb); a PC environmental reverb standard created by Creative Labs; all sound cards can have EAX capability 99
100 EAX (Environmental Audio eXtensions) Multi-Environments: Supports rendering of multiple simultaneous audio environments in real time, enabling the creation of exceptionally realistic acoustic environments in games containing multiple locations/rooms where differences in size, texture and/or shape are present. Environment Panning: Makes spatializing and localizing environments in 3D possible, providing new 3D gaming effects never heard before. Environment Reflections: Offers localization of early reflections and echoes, bringing more detail and realism to 3D gaming. Environment Filtering: Accurately simulates sound propagation in open and closed environments. 100
101 EAX (Environmental Audio eXtensions) Environment Morphing: Allows for seamless transition of audio from one environment to the next. Extreme Effects: New EAX effects rendered by an extended reverberation algorithm that surpasses the complexity of many studio-based reverb processors. Enhanced 3D Audio Performance: * Enhanced 3D audio performance through the correction of every positional delay and elimination of undesired audio artifacts (comb filtering); * Increased positional accuracy, especially in elevation (vertical plane), resulting in smoother movements and rotations; * HRTF filters and cross-talk cancellation algorithms for headphones, 2-speaker, 4-speaker, 5.1-speaker, 6.1-speaker and 7.1-speaker setups 101
102 Creative Labs Sound Blaster Audigy 4 Pro High-definition audio quality for playback and recording: Playback of 64 audio channels, each at an arbitrary sample rate; 24-bit analog-to-digital conversion of analog inputs at 96 kHz sample rate; 24-bit digital-to-analog conversion of digital sources at 96 kHz to analog 7.1 speaker output; 24-bit digital-to-analog conversion of stereo digital sources at 192 kHz to stereo output; 16-bit to 24-bit recording sampling rates: 8, , 16, 22.05, 24, 32, 44.1, 48 and 96 kHz; Supports Sony/Philips Digital Interface (SPDIF) format of up to 24-bit/96 kHz quality, with selectable sampling rate of 44.1, 48 or 96 kHz; Low-latency multitrack recording with ASIO 2.0 support at 16-bit/48 kHz and 24-bit/96 kHz resolution. 102
103 Turtle Beach Montego DDL 7.1 Sound Card Optical S/PDIF in/out for pure digital audio at resolutions of up to 24 bits at a 96 kHz sample rate for playback (out) and 16 bits at a 48 kHz sample rate for recording (in); Allows pass-through of Dolby Digital and DTS multi-channel DVD sound to external A/V receivers or digital speaker systems; Selectable 2, 4, 6 or 8 output channels with 24-bit playback at up to 96 kHz sampling rate; Converts stereo sound sources to multi-channel format, so you can listen with an enhanced multi-speaker surround sound environment; Pure surround sound and virtual surround sound from eight analog outputs provide 7.1 surround sound; Supports game surround-sound APIs such as EAX 2.0 and A3D; PCI interface with bus mastering and burst modes 103
104 Commercial 3D Sound Cards
Name | Chip / 3D sound engine / API | In/Out | SRP
Creative Sound Blaster Audigy 4 Pro | CA10200 DSP / CreativeWare / A3D 1.0, EAX Advanced HD 4 | 7.1 analog out; 5.1 digital out (DIN); 2 digital in/out (coaxial); 2 digital in/out (optical); AC3/DTS pass-thru | $299
Philips Acoustic Edge | ThunderBird Avenger / QSound / A3D 1.0, EAX 2.0 | 5.1 analog out; 2 digital in/out (coaxial); AC3/DTS pass-thru | $100
Turtle Beach Montego DDL 7.1 | EAX 1 and 2, A3D, I3DL2 and DirectSound 3D | 7.1 analog out; optical S/PDIF in/out at 24-bit (out) / 16-bit (in), sample rates 96 kHz (out) and 48 kHz (in) | $80
105 Output Devices Haptic Interfaces Haptics comes from the Greek hapthai, meaning the sense of touch; Groups touch feedback and force feedback 105
106 Output Devices Touch Feedback Relies on sensors in and close to the skin; Conveys information on contact surface geometry, roughness, slippage, and temperature; Does not actively resist user contact motion; Easier to implement than force feedback. 106
107 Output Devices Force Feedback Relies on sensors on muscle tendons and bones/joints (proprioception); Conveys information on contact surface compliance, object weight, and inertia; Actively resists user contact motion; More difficult to implement than touch feedback (no commercial products until the mid-90s). 107
108 Haptic Interfaces Human touch sensing mechanism Most touch sensors are in the hand (much lower density on other parts of the body); Four primary types of sensors: 40% are Meissner's corpuscles, which detect movement across the skin (velocity detectors); 25% are Merkel's disks, which measure pressure and vibrations; 13% are Pacinian corpuscles, deeper in the skin (dermis), which act as acceleration sensors and are most sensitive to vibrations of about 250 Hz; 19% are Ruffini corpuscles, which detect skin shear and temperature changes 108
109 Haptic Interfaces Skin touch sensors 109
110 Haptic Interfaces Sensorial adaptation Measured as the decrease in electrical signals from the skin sensor over time, for a constant stimulus; If the sensor produces a constant electrical discharge for a constant mechanical stimulus, it is called Slowly Adapting (SA); If the rate of electrical discharge drops rapidly over time for a constant stimulus, it is called Rapidly Adapting (RA) 110
111 Haptic Interfaces Spatial resolution Measured by the receptive field size of a sensor; If the sensor has a large receptive field, it has low spatial resolution (Pacinian and Ruffini: SA-II, RA-II); If the receptive field is small, it has high spatial resolution (Meissner and Merkel: SA-I, RA-I) 111
112 Haptic Interfaces Two-point limen test: 2.5 mm for the fingertip, 11 mm for the palm, 67 mm for the thigh 112
113 Haptic Interfaces 113
114 Haptic Interfaces Human grasping configurations 114
115 Haptic Interfaces Maximum and sustained force exertion Maximum force exerted during power grasp averages 400 N (male) and 225 N (female); Looking at body location, force output grows from 50 N at the PIP finger joint to 100 N at the shoulder; Sustained force feedback is much smaller than maximum, owing to fatigue and pain. (1 N is the force needed to accelerate a 1 kg mass at 1 m/s2.) 115
116 Haptic Interfaces Fatigue measured as a function of % Maximum Voluntary Contraction (MVC) and rest cycle 116
117 Haptic Interfaces Haptic feedback actuators Need to maximize the power/weight ratio; Need a high power/volume ratio; Need high bandwidth; Need a high dynamic range (fidelity); Need to be safe for the user. None of the current actuator technologies satisfies all these requirements 117
118 Haptic Interfaces Actuator comparison based on P/W ratio 118
119 Output Devices Touch Feedback Interfaces Can be desk-top or wearable (gloves); Touch feedback mouse; CyberTouch glove; Temperature feedback actuators 119
120 Haptic Interfaces The ifeel Mouse (0-125 Hz). 120
121 Haptic Interfaces 6 individually controlled vibrotactile actuators ( Hz frequency range); 1.2 N amplitude at 125 Hz CyberTouch Glove (Virtex) 121
122 Output Devices VC 3.3 on book CD 122
123 Output Devices Temperature feedback Adds simulation realism by simulating surface thermal feel; No moving parts; Uses thermoelectric pumps made of solid-state materials sandwiched between a heat source and a heat sink; A single pump can produce 65 C differentials 123
124 Temperature feedback actuator 124
125 User comfort zone: if the system fails, heat travels back through the pump and can burn the skin. Temperature feedback actuator control 125
126 Output Devices Force Feedback Interfaces Need mechanical grounding to resist user motion; Can be grounded on a desk, a wall, or on the user's body; More difficult to construct and more expensive than tactile feedback interfaces 126
127 Haptic Interfaces 127
128 Uses potentiometers to sense position in spherical coordinates; Uses electrical actuators to apply resistive torques; Logitech Force feedback joystick 128
129 129
130 The PHANToM used for 3D sculpting (courtesy of SensAble Technology Co.) 130
131 131
132 PHANToM Omni 132
133 PHANToM Comparison 133
134 PHANToM 1.6/
135 Omega Haptic Device (3 DOF force feedback): 3 DOF; replaceable end-effector; resolution mm; max continuous force 12 N; stiffness 14.5 N/mm; connectivity USB 2.0/PCI; full gravity compensation; real-time safety (velocity monitoring and electromagnetic brakes).
Delta Haptic Device (3 DOF force feedback): 3 DOF (36 cm diam x 30 cm); resolution 0.03 mm; max continuous force 20 N; stiffness 14.5 N/mm; connectivity PCI; real-time safety (velocity monitoring and electromagnetic brakes).
Delta Haptic Device (6 DOF force feedback): 6 DOF (36 cm diam x 30 cm), 20 degrees/axis; resolution 0.03 mm, 0.04 degrees; max continuous force 20 N; continuous torque 0.2 Nm; stiffness 14.5 N/mm; connectivity PCI; real-time safety (velocity monitoring and electromagnetic brakes). 135
136 NOVINT FALCON, 3 DOF force feedback: 3 DOF (left/right, forward/backward, up/down); rumble and vibrations; 3D exploration and textures; dynamic effects (inertia, weight, momentum); cost less than $300; resolution >0.06 mm; max continuous force 10 N; stiffness 5 N/mm; connectivity USB; kHz control bandwidth 136
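The stiffness (N/mm) and maximum continuous force figures in these spec sheets bound what a simple penalty-based renderer can command. A sketch of a virtual-wall force law, using the Falcon's figures (5 N/mm, 10 N) purely as example values:

```python
def wall_force(penetration_mm, stiffness_n_per_mm=5.0, max_force_n=10.0):
    """Penalty-based virtual wall: F = k * penetration, clamped to the
    device's maximum continuous force (Novint Falcon figures as examples)."""
    if penetration_mm <= 0.0:   # proxy outside the wall: no force
        return 0.0
    return min(stiffness_n_per_mm * penetration_mm, max_force_n)
```

Past 2 mm of penetration the commanded force saturates at the 10 N cap, which is why higher-stiffness devices render "harder" walls.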
137 The HapticMaster: 3 DOF cylindrical robot; max force output 250 N; stiffness 50 N/mm; uses a force-in, position-out arrangement 137
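The force-in, position-out arrangement is admittance control: the measured user force drives a simulated mass-damper, and the resulting state becomes the position command for the robot's stiff position servo. A one-axis sketch, with mass, damping, and rate being assumed illustrative values:

```python
def admittance_step(x, v, f_measured, mass=2.0, damping=10.0, dt=0.001):
    """One control tick of force-in, position-out (admittance) control:
    integrate a virtual mass-damper driven by the measured user force
    and return the new position command for the position servo."""
    a = (f_measured - damping * v) / mass
    v_new = v + a * dt          # semi-implicit Euler integration
    x_new = x + v_new * dt
    return x_new, v_new

x, v = 0.0, 0.0
for _ in range(1000):           # 1 s of the user pushing with 5 N
    x, v = admittance_step(x, v, 5.0)
```

The handle accelerates under the push and settles toward the terminal velocity f/b, so the user feels a damped, massive object.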
138 The CyberGrasp force feedback glove: exoskeleton worn over a CyberGlove; cables and pulleys; 16 N/finger (continuous?); weight 539 grams; remote electrical actuators in a control box. 138
139 The CyberGrasp force feedback glove VC 3.4 on book CD 139
140 The CyberPack (courtesy of Virtex Co.): CyberGrasp glove, electronic interface box, tether, wrist tracker 140
141 6 DOF mechanical arm; wrist position and force feedback; no need for a tracker; allows simulation of weight and inertia, not possible with glove-only interfaces. CyberForce interface (introduced recently) 141
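Because the arm is mechanically grounded, it can exert a net force on the wrist, which is what makes weight and inertia displayable at all. A sketch of the force such an arm would command for a grasped virtual object (the function and values are illustrative assumptions):

```python
G = 9.81  # gravitational acceleration, m/s^2

def grasped_object_force(mass_kg, hand_accel):
    """Net force (x, y, z) the grounded arm must display for a grasped
    virtual object: its weight plus the inertial reaction to the
    measured hand acceleration (m/s^2), z pointing up."""
    ax, ay, az = hand_accel
    return (-mass_kg * ax, -mass_kg * ay, -mass_kg * (az + G))

f = grasped_object_force(0.5, (0.0, 0.0, 0.0))  # statically held object
```

A glove-only device can squeeze the fingers but cannot produce this net downward pull, since its reaction forces all close on the hand itself.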
142 Haptic Interfaces CyberForce interface VC 3.5 on book CD 142
143 143