Remote Driving With a Multisensor User Interface


Copyright 2000 Society of Automotive Engineers, Inc.

Gregoire Terrien, Institut de Systèmes Robotiques, L'Ecole Polytechnique Fédérale de Lausanne
Terrence Fong and Charles Thorpe, The Robotics Institute, Carnegie Mellon University
Charles Baur, Institut de Systèmes Robotiques, L'Ecole Polytechnique Fédérale de Lausanne

ABSTRACT

Remote driving is a difficult task, primarily because operators have problems understanding the remote environment and making control decisions. To make remote driving easier and more productive, we are using multiple sensors to build active, sensor fusion based interfaces. In our work, we use sensor fusion to facilitate human perception and to enable efficient command generation. In this paper, we describe a multisensor user interface for remote driving.

INTRODUCTION

Perhaps the most difficult aspect of remote driving is that the operator is unable to directly perceive the remote environment. Instead, he is forced to rely on sensors, bandwidth-limited communications links, and an interface to provide him with information. As a result, the operator often fails to understand the remote environment and makes judgement errors. Thus, to make remote driving easier, we need to find ways to facilitate information transfer and to improve situational awareness.

Our approach is to develop sensor fusion based operator interfaces. Sensor fusion has long been used to improve automatic processes such as mapping. However, we believe that sensor fusion can also be used to create capable and compelling interfaces. In particular, we are using sensor fusion to: (1) create displays which enable better understanding of the remote environment; and (2) efficiently and accurately generate motion commands.

In [1], we described a system for fusing and displaying stereo vision and sonar data. In this paper, we describe extensions to our previous work. Specifically, we discuss the addition of a scanning laser rangefinder (ladar), integration of the multisensor platform with a mobile robot, and the development of command generation tools.

RELATED RESEARCH

SENSOR FUSION DISPLAYS

Sensor fusion displays combine information from multiple sensors or data sources to present a single, integrated view. Sensor fusion displays are important for applications in which the operator must rapidly process large amounts of multi-spectral or dynamically changing heterogeneous data.

In military aerospace, sensor fusion displays combine information from imaging sensors (visible light cameras, night-vision devices, millimeter wave radar, thermal imagers, etc.) and databases (digital maps, target catalogs). The resultant displays are used to improve cockpit efficiency during target acquisition or tracking, tasks which demand high levels of situation awareness and cognitive decision making[6].

In civil air transport, sensor fusion displays are being considered for use in enhanced or synthetic vision systems. These displays would enable pilots to better detect runway features and incursions during landing, and would aid in detecting obstacles and traffic during taxi[5]. Sensor fusion displays would also enable airport traffic controllers to operate in low-visibility weather conditions, i.e., the sensors would allow controllers to see aircraft movements through fog or cloud[4].

More recently, sensor fusion displays have been used as control interfaces for telerobots.
In particular, the Virtual Environment Vehicle Interface (VEVI) combines data from a variety of sensors (stereo video, GPS, inclinometers, etc.) to create an interactive, graphical 3D representation of the robot and its environment. For example, multiple types of data are used to construct polygonal models of explored terrain[3].

AUGMENTED REALITY

Augmented reality is a variation of Virtual Environments (VE), otherwise known as Virtual Reality. Whereas VEs completely replace reality by immersing the user inside a synthetic environment, augmented reality allows the user to see the real world (often using a head-mounted, see-through display) in combination with information superimposed or composited on the display. Thus, augmented reality enhances a user's perception of and experience with the real world[8].

Augmented reality has been used to assist users in outdoor navigation tasks. In [11], a wearable computer system and see-through display provide location-specific multimedia information to Columbia University visitors. Differential GPS (dGPS) and a magnetometer/inclinometer are used to track a user's location and to update the display with information such as current location, nearby points of interest, and point-to-point directions. Similarly, in [9], dGPS and a digital compass are used to create graphical overlays in a see-through display to assist users in large-area terrestrial navigation.

Augmented reality has also been used to assist remote collaboration. In [10], a collaborative system provides a local user with direct, unmediated access to the output of sensors (proximity, location, electronic tags) attached to a remote user's wearable computer. This allows the local user to provide context-specific information to the remote user. For example, an expert is able to provide a field technician with supplemental information (schematics, specifications, etc.) based on the sensed location.

SYSTEM CONFIGURATION

PREVIOUS WORK

We previously developed a multisensor system incorporating stereo vision, an ultrasonic sonar ring, and odometry[1][2]. We chose these sensors based on their complementary characteristics: stereo vision provides passive wide-area ranging with high angular resolution but with non-linear depth measurements; sonar provides active sector-based ranging with linear depth measurements but with low angular resolution.

We used the multisensor system to create two sensor fusion displays: a 2D image with color overlay and a local map. The 2D image showed range data as colors overlaid on a monochrome 2D camera image. We found that this display helped users make relative distance judgements and spot nearby obstacles. The local map was constructed via histogramic filtering of the range data into an occupancy grid. With the high angular resolution from stereo, object contours were clearly visible and the map improved users' situational awareness.

The system, however, had two significant weaknesses. First, we found that the sonar/stereo combination failed in certain environments. For example, smooth surfaces with low texture (e.g., a white wall) were frequently missed by both the sonars (specular reflection) and the stereo (poor correlation). Second, the sensors were mounted on an electric wheelchair equipped with wheel encoders. While this enabled us to show that sensor fusion displays improved understanding of the remote environment, it did not allow us to use the displays for command generation (i.e., users were unable to teleoperate the wheelchair).

LADAR INTEGRATION

To address our system's sensing inadequacies, we added a Sick Proximity Laser Scanner (PLS) to the sensor suite[7]. Ladar sensors provide precise range measurements with very high angular resolution, but are usually limited to a narrow horizontal band (i.e., a half-plane).
This forms a good complement to the sonar and stereo sensors, which are less accurate but have a broader field-of-view. The PLS has 5 cm accuracy over a wide range (20 cm to 50 m), a 180 degree horizontal field-of-view (361 discrete measurements), and a scan rate greater than 5 Hz.

We should note that ladar is not, by itself, a panacea. Although this sensor provides high resolution ranging independent of surface type and is well suited for mapping certain environments (e.g., indoor office walls), it also has limitations. Since the ladar measures depth in a narrow horizontal band, obstacles which do not intersect the scanning plane will be missed. Moreover, obstacles with varying vertical profiles (e.g., a table) will not be correctly measured (e.g., only the table legs may be scanned). Additionally, both smoke and steam reduce beam intensity, thus producing erroneously large measurements (due to weak return) or complete failure (when the reflected signal is too weak to be detected). Finally, ladar is susceptible to glare, which often makes it problematic for outdoor use.

Figure 1 shows our current multisensor system. The ultrasonic sonar ring uses Polaroid 600 series electrostatic transducers and provides time-of-flight ranging at 25 Hz. The stereo vision system is a Small Vision Module[12] and produces 2D intensity (monochrome) images and 3D range (disparity) images at 5 Hz. Odometry is obtained from wheel-mounted optical encoders.
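For fusion and map building, readings from these sensors must first be expressed in a common, robot-centered frame. The minimal sketch below is our own illustration (not code from our system); the frame convention, function names, and speed-of-sound constant are assumptions. It converts a PLS scan and a set of sonar time-of-flight readings into 2D points in the robot frame.

import math

SPEED_OF_SOUND = 343.0  # m/s in room-temperature air (assumed)

def ladar_scan_to_points(ranges, fov_deg=180.0):
    """Convert a PLS scan (e.g., 361 range readings over 180 degrees)
    into (x, y) points in the robot frame (x forward, y left)."""
    n = len(ranges)
    points = []
    for i, r in enumerate(ranges):
        # Sweep from -fov/2 (right of heading) to +fov/2 (left of heading).
        angle = math.radians(-fov_deg / 2.0 + i * fov_deg / (n - 1))
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

def sonar_tof_to_points(tofs, bearings_deg):
    """Convert sonar time-of-flight readings (seconds) and each
    transducer's fixed bearing (degrees) into (x, y) points."""
    points = []
    for tof, bearing in zip(tofs, bearings_deg):
        r = tof * SPEED_OF_SOUND / 2.0   # round-trip time -> one-way range
        a = math.radians(bearing)
        points.append((r * math.cos(a), r * math.sin(a)))
    return points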

ROBOT INTEGRATION

To investigate remote driving, we mounted the sensors on a Pioneer2 DX (P2DX) mobile robot (Figure 1). (Pioneer is a trademark of ActivMedia, Inc.) The P2DX has differential drive and is designed for indoor environments. We equipped the P2DX with on-board computing (Ampro P5e) and wireless ethernet (Lucent WaveLAN). On-board computing is used for sensor management (data collection and transmission), obstacle avoidance, and position servoing.

The obstacle avoidance algorithm differs from conventional methods because it does not perturb the robot's trajectory near obstacles. Instead, the algorithm scans for obstacles in the direction of motion and slows or stops the robot's translation and rotation rates[7]. This allows the robot to avoid collisions without sudden directional changes which would confuse the operator. (A minimal sketch of this behavior appears after the user interface overview below.)

Figure 1. Multisensor system (stereo vision system, laser scanner, sonars, Pioneer2 DX)

Table 1 lists situations commonly encountered in indoor vehicle teleoperation. Although no individual sensor works in all situations, the collection of sensors provides complete coverage.

Table 1. Sensor performance in teleoperation situations

  Situation                            | 2D Image (intensity) | 3D Image (disparity) | Sonar     | Ladar (laser)
  smooth surfaces (no visual texture)  | OK                   | Fails (a)            | Fails (b) | OK
  rough surface (little/no texture)    | OK                   | Fails (a)            | OK        | OK
  far obstacle (> 10 m)                | Fails (c)            | Fails (d)            | Fails (e) | OK
  close obstacle (< 0.5 m)             | OK (f)               | Fails (g)            | OK (h)    | OK (i)
  small obstacle (on the ground)       | Fails (c)            | OK                   | OK        | Fails (j)
  dark environment (no ambient light)  | Fails                | Fails                | OK        | OK

  a. no correlation   b. specular reflection   c. no depth measurement   d. poor resolution   e. echo not received
  f. limited by focal length   g. high disparity   h. limited by transceiver   i. limited by receiver   j. outside of scan plane

USER INTERFACE

We developed a remote driving interface which contains sensor fusion displays and a variety of command generation tools. The interface is designed to improve situational awareness, facilitate depth judgement, support decision making, and speed command generation. In particular, we placed considerable emphasis on creating effective affordances and representations so that data is readily accessible and understandable. Additionally, since the operator has to generate remote driving commands by analyzing what is displayed on the screen, we tried to provide an interface which is intuitive, coherent, and maximizes information transfer.

Figure 2 shows the main window of our sensor fusion based user interface. The interface contains three primary tools: the image display, the motion pad, and the map display. In addition, to enable the operator to better understand the remote environment and to better make decisions, we developed tools for measuring distance, checking clearance, and finding correspondences between map and image points.

Figure 2. Sensor fusion user interface for remote driving (image display, map display, motion pad)
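The slow-or-stop obstacle avoidance described under ROBOT INTEGRATION might look like the following minimal sketch. This is our own illustration, not the controller of [7]; the distance thresholds, corridor width, and function name are assumptions.

def safeguard_rates(trans_cmd, rot_cmd, obstacle_points,
                    stop_dist=0.3, slow_dist=1.0, corridor_half_width=0.4):
    """Slow or stop the commanded rates based on the nearest obstacle in
    the direction of motion, without changing the commanded direction.
    obstacle_points are (x, y) in the robot frame (x forward, y left)."""
    sign = 1.0 if trans_cmd >= 0.0 else -1.0
    ahead = [abs(x) for (x, y) in obstacle_points
             if x * sign > 0.0 and abs(y) < corridor_half_width]
    if not ahead:
        return trans_cmd, rot_cmd          # nothing in the motion corridor
    nearest = min(ahead)
    if nearest <= stop_dist:
        return 0.0, 0.0                    # obstacle too close: stop
    if nearest < slow_dist:
        scale = (nearest - stop_dist) / (slow_dist - stop_dist)
        return trans_cmd * scale, rot_cmd * scale   # slow down smoothly
    return trans_cmd, rot_cmd

Because only the magnitudes of the commanded rates are scaled, the robot's direction of motion is never changed, which is the property that keeps the behavior predictable for the operator.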

Image Display

The image display contains a monochrome video image with a color overlay to improve depth judgement and obstacle/hazard detection. Hue values encode depth information from close (yellow) to far (blue). Since close depth is more relevant (e.g., for identifying and avoiding nearby obstacles), we vary hue exponentially (i.e., near ranges are encoded with more values than far ranges).

Motion Pad

The motion pad enables the operator to directly control the robot. Clicking on the vertical axis commands a forward/reverse translation rate. Clicking on the horizontal axis commands a rotation rate. Translation and rotation are independent, thus the operator can simultaneously control both by clicking off-axis. The pad's border color indicates the robot's status (moving, stopped, etc.).

Map Display

To navigate the robot, we created a map display which provides the operator with a bird's-eye view of the remote environment. The display is constructed as the robot moves and shows sensed environment features and the robot's path. The map display provides a local and a global map. With the local map, the user can precisely navigate through complex spaces. For large-area navigation, the global map helps maintain situational awareness by showing where the robot has been.

At any time, the operator can annotate the global map by adding comments or drawing virtual obstacles. For example, if the operator finds something of interest, he can label the map. Additionally, if he decides that a particular region is dangerous, he can draw an artificial barrier on the map and the robot's obstacle avoidance will keep the robot from entering the region (see Figure 8).

MODULES AND DATA FLOW

The system architecture is shown in Figure 3. The robot is driven by rate or pose commands generated by the interface. Pose commands are processed by a path servo which generates a smooth trajectory from the current position to the target pose. All motion commands are constrained by the obstacle avoidance module. All sensors are continuously read on-board the robot and the data transmitted to the interface. The sensor readings are used to update the image and map displays. Fusion algorithms for both displays are described in the following sections.

An event monitor watches for critical system events and mode changes (e.g., obstacle avoidance in progress). It also continually monitors robot health and generates appropriate status messages to be displayed to the user.

Figure 3. System architecture (interface: image display, map display, rate and pose commands; robot: path servo, obstacle avoidance, event monitor, motors, sensors, odometry, sonar, stereo, robot state)

SENSOR FUSION ALGORITHMS

We create the image display by overlaying range information as colors on a monochrome image taken from one of the stereo cameras. This method does not provide an absolute indication of range. However, we find it greatly improves relative distance judgement.
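A minimal sketch of this kind of exponential range-to-hue mapping (close ranges toward yellow, far ranges toward blue, with more hue values spent on nearby ranges) is shown below; the range limits, decay constant, and exact hue endpoints are illustrative assumptions rather than our actual values.

import colorsys
import math

def range_to_overlay_color(r, r_min=0.3, r_max=10.0, decay=2.0):
    """Map a range value (meters) to an RGB overlay color: close ranges
    map toward yellow, far ranges toward blue, and the exponential warp
    spends more hue values on nearby ranges than on distant ones."""
    r = min(max(r, r_min), r_max)
    t = (r - r_min) / (r_max - r_min)               # 0 = close, 1 = far
    t = (1.0 - math.exp(-decay * t)) / (1.0 - math.exp(-decay))
    hue = 1.0 / 6.0 + t * (2.0 / 3.0 - 1.0 / 6.0)   # 60 deg (yellow) -> 240 deg (blue)
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)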

Figure 4 shows how the image display is constructed. For each overlay pixel, the sonar range is used to decide whether to display sonar or stereo data. When the sonar range is low, sonar data is used because stereo correlation fails when objects are too close. Otherwise, if the sonar range is high, stereo data is displayed. In addition, because the ladar is precise and reliable in an office environment, we always overlay ladar data when it is available (i.e., unless the ladar detects glare).

Figure 4. Image display processing (per pixel: sonar range < 0.6 m -> display sonar, otherwise display stereo; display ladar unless glare is detected)

We create the map display using sensor data and vehicle odometry for registration. We currently do not perform any filtering or adaptive registration of sensor data. Thus, map accuracy is highly correlated with odometry errors. The interface allows the user to select which sensors are used for map building at any time.

Figure 5 shows how the map is constructed. The local map only shows current sensor data in proximity to the robot. Past sensor readings are discarded whenever new data is available. In contrast, the global map displays sensor data over a wide area and never discards sensor data. Additionally, the global map allows the user to add annotations.

Figure 5. Map display processing (sonar, stereo, and ladar data feed the local map; the local map, interface, and annotation tool feed the global map)

RESULTS

IMAGE DISPLAY

To evaluate the image display, we placed the robot in a setting which has difficult-to-sense characteristics: in front of the robot is a smooth, untextured wall; close to the robot is a large office plant. Figure 6 shows the image display for this scene with various overlays. As the figure shows, each sensor individually has problems, but collectively they provide robust sensing of the environment.

Figure 6. Sensor fusion based image display (stereo only; sonar only; sonar and stereo; ladar only; ladar, sonar, and stereo)

In the top left image (stereo only), the wall edges are clearly detected and the plant partially detected (the left side is too close for stereo correlation). However, the center of the wall (untextured) is completely missed. In the top right image (sonar only), the plant is detected well, but the wall is shown at incorrect depths due to specular reflection. In the middle left image (fused sonar and stereo), both the wall edge and plant are detected, but the center remains undetected. In the middle right image (ladar only), we see that the wall is well defined, but that the planar scan fails to see the plant. In the bottom image (all sensors), we see that all features are properly detected. The sonars detect the plant, the ladar follows the wall, and stereo finds the wall edge.

MAP DISPLAY

Map Building

To evaluate map building, we placed the robot in a room with a variety of surfaces (smooth, rough, textured, non-textured). Figure 7 shows maps constructed with different sensor combinations. In the first image (stereo only), we see some clearly defined corners, but some walls are not well detected due to lack of texture. In the second image (sonar only), the sonar's low angular resolution and specular reflections result in poorly defined contours. In the third image (stereo and sonar), both corners and walls are well detected; however, due to stereo's non-linear depth accuracy, there is significant error. In the final image (ladar only), the map clearly shows the room. Obviously, for an indoor environment in which the principal features are uniformly vertical (i.e., walls), the ladar produces the most useful maps.

Figure 7. Map display (stereo only; sonar only; sonar and stereo; ladar only)
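The map construction described above under SENSOR FUSION ALGORITHMS (each reading registered using the robot's odometric pose, with the local map keeping only the latest readings and the global map accumulating everything) can be summarized in a minimal sketch. The class, grid resolution, and frame convention below are illustrative assumptions, and points are assumed to already be expressed in the robot frame.

import math

class MapBuilder:
    """Sketch of local/global map building from odometry-registered points."""

    def __init__(self, cell_size=0.1):
        self.cell_size = cell_size     # grid resolution in meters (assumed)
        self.local_points = []         # latest readings only
        self.global_cells = set()      # occupied cells, never discarded

    def update(self, pose, sensor_points):
        """pose = (x, y, theta) from dead-reckoned odometry; sensor_points
        are (x, y) in the robot frame from the user-selected sensors."""
        x0, y0, th = pose
        world = [(x0 + px * math.cos(th) - py * math.sin(th),
                  y0 + px * math.sin(th) + py * math.cos(th))
                 for (px, py) in sensor_points]
        self.local_points = world      # local map: discard past readings
        for (wx, wy) in world:         # global map: accumulate everything
            self.global_cells.add((int(wx // self.cell_size),
                                   int(wy // self.cell_size)))

Because registration relies purely on dead-reckoned odometry, accumulated global-map error grows with odometric drift, which is one reason better positioning is identified as future work.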

Map Annotation

We found that annotation greatly improved the usefulness of the map display in two ways. First, labeling helped operators preserve contextual information. Second, virtual obstacles enabled operators to add critical information which was missed by the sensors.

Since the maps only show a 2D projection of the world, we found that obstacles with irregular profiles are difficult to distinguish without labeling. For example, chairs are often hard to identify: they may be shown as a set of points (legs sensed), a line (horizontal edges sensed), or a scattered cloud of points. By allowing the user to add labels, features on the maps became easier to interpret.

We found that by letting operators create virtual obstacles, the safety and robustness of remote driving were improved. For example, by drawing lines on the map, operators were easily able to keep the robot away from dangerous, but difficult to sense, obstacles.

Figure 8 shows an annotated map with several labels (numbered 1 to 4). As the figure shows, it is not evident that the center of the map (label 4) shows a set of desks. Thus, the operator has chosen to mark this region with the comment "Here are desks" and to include a camera image. Figure 8 also shows virtual obstacles (black lines and regions): a door (near label 2) and unsafe areas (tables: labels 1 and 3) have all been fenced off.

Figure 8. An annotated map

OBSTACLE DETECTION

One of the most challenging tasks during vehicle teleoperation is obstacle detection. Although no sensor exists which always detects all obstacles, by exploiting complementary sensor characteristics, we can avoid individual sensor failures and improve obstacle detection.

Figure 9 shows a scene with a box on the floor. Because the box is too small, it is not detected by the ladar (it is too short to intersect the scanning plane) nor by the sonars (it is located outside the sonar cones). However, it is properly detected by stereo, as both displays show.

Figure 9. Detection of a small obstacle (box detected by stereo)

Figure 10 shows a situation in which the robot is approaching a chair. We can see that the chair is well detected by the stereo camera and the sonars. However, the ladar has problems with the chair because only the supporting post intersects the scanning plane (resulting in a tiny spot on the map).

Figure 10. Detection of a chair

COMMAND GENERATION

In order for remote driving to be efficient, the interface must provide capable command generation tools. Thus, in addition to the motion pad, we used the image and map displays to generate position commands. Double-clicking on the image selects a target point: heading is computed from the offset from the central image axis and distance is based on range data. Similarly, double-clicking on the map selects a relative position. In both cases, once a target point is designated, the robot computes a path and begins moving to that position.

If the robot discovers an obstacle during the automatic motion, it automatically slows down and informs the operator (see Figure 2). If the obstacle becomes too close, the robot stops and warns the operator that it cannot continue in this direction. At any time during the automatic motion, the user can again take control of the robot by clicking on the motion pad.

DECISION MAKING TOOLS

Distance Measurement

Most human senses are relative and not absolute. For example, we can say if something is hot or very hot, but not what its absolute temperature is. Human vision works much the same way: judging relative distance is easier (and more precise) than estimating absolute distances. For remote driving, however, distance information is indispensable for choosing paths and making decisions. Thus, we added a measurement tool to the image display (Figure 11).

When the operator clicks on the image or the map display, the interface automatically computes the absolute distance based on the available range data. The distance is then overlaid on the display and the corresponding point is highlighted (e.g., if the user clicks a point on the image display, the corresponding point is shown on the map display).

Figure 11. Depth measurement (distance to point)

Clearance Check

Another significant problem faced during remote driving is clearance estimation: will the vehicle be able to pass through an area or underneath something without getting stuck? Thus, we provided a tool for checking clearance. The robot's dimensions are represented by a rectangle overlaid on the image (Figure 12). The size of the rectangle is scaled based on the range of the point being checked. Also, the rectangle is colored to reflect depth. When the operator clicks anywhere in the image, the tool projects the rectangle and displays the clearance required by the robot.

Figure 12. Clearance check

Finding Correspondences

In some situations, the operator has difficulty finding the correspondence between a point on the map and a point on the image (and vice-versa). Since this information is critical for navigating and for accurately annotating the map, we implemented a tool to display matching points (Figure 13).

When the operator clicks on the image, the distance to the point and the corresponding map point (with the vector to the point) are shown. If the depth of the chosen point is not available, only the vector is drawn. Similarly, when the operator clicks on the map, a vertical line is drawn on the image: a point on the map corresponds to a vertical line in the image since the map is an orthogonal projection. The line is colored based on distance.

Figure 13. Finding correspondences (heading to point)
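The image-based target designation described under COMMAND GENERATION (heading computed from the pixel's offset from the central image axis, distance taken from the fused range data) might be implemented along the lines of the following minimal sketch; the pinhole camera model, field of view, and function name are illustrative assumptions.

import math

def image_click_to_target(px, image_width, range_at_pixel, hfov_deg=60.0):
    """Convert a double-click at horizontal pixel px into a (heading,
    distance) target relative to the robot. Heading comes from the pixel
    offset from the central image axis (simple pinhole model); distance
    comes from the fused range data available at that pixel."""
    focal_px = (image_width / 2.0) / math.tan(math.radians(hfov_deg / 2.0))
    heading = math.atan2(image_width / 2.0 - px, focal_px)  # positive = left of center
    return heading, range_at_pixel

# Example: a click 80 pixels right of center in a 320-pixel-wide image,
# with 2.5 m of fused range data at that pixel.
heading, distance = image_click_to_target(240, 320, 2.5)
target_xy = (distance * math.cos(heading), distance * math.sin(heading))

The resulting relative target is then handed to the path servo, which generates the smooth trajectory and automatic motion described above.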

FUTURE WORK

A number of improvements would make our system more reliable. First, we currently estimate robot position with dead-reckoned odometry. Since we use robot position to register sensor data, better positioning would improve accuracy. Second, our map building algorithm is very simplistic: we perform no data filtering and no registration. However, by applying techniques such as [13], we could create large-area maps and benefit from map-based localization. Finally, a higher level of robot autonomy would lessen the impact of operator differences and handicaps on system performance. For example, improving the robot's ability to identify hazards would reduce the need for the operator to be in continuous control.

To date, we have only collected anecdotal evidence that the sensor fusion interface improves remote driving. In order to rigorously assess usability and to better understand which features/affordances are most helpful, we need to conduct formal user studies. In particular, we would like to quantify how each display and tool improves or hinders task performance. Moreover, we need to understand the interface's limitations: when does it work well and when does it fail?

Lastly, we have carefully tuned our system for remote indoor driving based on sensor characteristics and environmental constraints. For example, our fusion algorithms give highest priority to ladar data because the ladar is ideal for sensing walls and objects in an office environment. But, if we wish to perform a different task, such as exploration in natural terrain, in which the ladar would be less effective (due to smoke/fog effects, distance to obstacles, etc.), we need to understand how to modify the system (e.g., what weighting factors need to be changed).

CONCLUSION

The guiding principle in our work is that intelligent interfaces between humans and robots improve teleoperation performance. In particular, we believe that truly integrated and efficient human-robot systems can only be achieved through the development of better interfaces. We have found that with an appropriate sensor suite and user interface, sensor fusion is a powerful method for improving vehicle teleoperation. In our system, we used a suite of sonar, stereo vision, and ladar sensors to remotely drive a mobile robot. Through our work, we demonstrated that a multisensor interface makes it easier to understand the remote environment, to assess the situation, to make decisions, and to effect control.

ACKNOWLEDGMENTS

We would like to thank Bart Nabbe and Jianbo Shi for their helpful discussions. We would also like to thank Martial Hebert and the CMU TMR Group for their assistance and for providing hardware. This work is partially supported by the DARPA TTO Tactical Mobile Robots program (NASA JPL) and by SAIC, Inc.

REFERENCES

1. Meier, R., Fong, T., Thorpe, C., and Baur, C., "A Sensor Fusion Based User Interface for Vehicle Teleoperation", Field and Service Robots, Pittsburgh, PA, August.
2. Meier, R., "Sensor Fusion for Teleoperation of a Mobile Robot", Diplôme Thesis, Swiss Federal Institute of Technology Lausanne (EPFL), March.
3. Hine, B., et al., "VEVI: A Virtual Environment Teleoperation Interface for Planetary Exploration", SAE 25th ICES, San Diego, CA, July.
4. Foyle, D., "Sensor Fusion Display Evaluation using Information Integration Models in Enhanced/Synthetic Vision Applications", AVID Workshop, NASA Conference Proceedings 10128, Moffett Field, CA, December.
5. Foyle, D.C., et al., "Enhanced/synthetic vision systems: Human factors research and implications for future systems", SAE Aerotech Meeting.
6. Wise, J., "Design of Sensor Fusion Displays: Human Factors and Display System Guidelines", Westinghouse R&D Center Research Report 87-1C60-SCVIS-R1.
7. Terrien, G., "Sensor Fusion Interface for Teleoperation", Diplôme Thesis, Swiss Federal Institute of Technology Lausanne (EPFL), March.
8. Azuma, R., "A Survey of Augmented Reality", in Presence: Teleoperators and Virtual Environments 6(4), August.
9. Thomas, B., et al., "A Wearable Computer System with Augmented Reality to Support Terrestrial Navigation", Second International Symposium on Wearable Computers, Pittsburgh, PA, October.
10. Bauer, M., et al., "A Collaborative Wearable System with Remote Sensing", Second International Symposium on Wearable Computers, Pittsburgh, PA, October.
11. Hollerer, T., Feiner, S., and Pavlik, "Situated Documentaries: Embedding Multimedia Presentations in the Real World", Third International Symposium on Wearable Computers, San Francisco, CA, October.
12. Konolige, K., "Small Vision System: Hardware and Implementation", Eighth International Symposium on Robotics Research, Hayama, Japan.
13. Thrun, S., et al., "A Real-Time Algorithm for Mobile Robot Mapping with Applications to Multi-Robot and 3D Mapping", IEEE ICRA, San Francisco, CA, April 2000.


More information

Color and More. Color basics

Color and More. Color basics Color and More In this lesson, you'll evaluate an image in terms of its overall tonal range (lightness, darkness, and contrast), its overall balance of color, and its overall appearance for areas that

More information

Israel Railways No Fault Liability Renewal The Implementation of New Technological Safety Devices at Level Crossings. Amos Gellert, Nataly Kats

Israel Railways No Fault Liability Renewal The Implementation of New Technological Safety Devices at Level Crossings. Amos Gellert, Nataly Kats Mr. Amos Gellert Technological aspects of level crossing facilities Israel Railways No Fault Liability Renewal The Implementation of New Technological Safety Devices at Level Crossings Deputy General Manager

More information

AE4-393: Avionics Exam Solutions

AE4-393: Avionics Exam Solutions AE4-393: Avionics Exam Solutions 2008-01-30 1. AVIONICS GENERAL a) WAAS: Wide Area Augmentation System: an air navigation aid developed by the Federal Aviation Administration to augment the Global Positioning

More information

Time-Lapse Panoramas for the Egyptian Heritage

Time-Lapse Panoramas for the Egyptian Heritage Time-Lapse Panoramas for the Egyptian Heritage Mohammad NABIL Anas SAID CULTNAT, Bibliotheca Alexandrina While laser scanning and Photogrammetry has become commonly-used methods for recording historical

More information

Keywords: Multi-robot adversarial environments, real-time autonomous robots

Keywords: Multi-robot adversarial environments, real-time autonomous robots ROBOT SOCCER: A MULTI-ROBOT CHALLENGE EXTENDED ABSTRACT Manuela M. Veloso School of Computer Science Carnegie Mellon University Pittsburgh, PA 15213, USA veloso@cs.cmu.edu Abstract Robot soccer opened

More information

ROBCHAIR - A SEMI-AUTONOMOUS WHEELCHAIR FOR DISABLED PEOPLE. G. Pires, U. Nunes, A. T. de Almeida

ROBCHAIR - A SEMI-AUTONOMOUS WHEELCHAIR FOR DISABLED PEOPLE. G. Pires, U. Nunes, A. T. de Almeida ROBCHAIR - A SEMI-AUTONOMOUS WHEELCHAIR FOR DISABLED PEOPLE G. Pires, U. Nunes, A. T. de Almeida Institute of Systems and Robotics Department of Electrical Engineering University of Coimbra, Polo II 3030

More information

White paper on SP25 millimeter wave radar

White paper on SP25 millimeter wave radar White paper on SP25 millimeter wave radar Hunan Nanoradar Science and Technology Co.,Ltd. Version history Date Version Version description 2016-08-22 1.0 the 1 st version of white paper on SP25 Contents

More information

Lecture 19: Depth Cameras. Kayvon Fatahalian CMU : Graphics and Imaging Architectures (Fall 2011)

Lecture 19: Depth Cameras. Kayvon Fatahalian CMU : Graphics and Imaging Architectures (Fall 2011) Lecture 19: Depth Cameras Kayvon Fatahalian CMU 15-869: Graphics and Imaging Architectures (Fall 2011) Continuing theme: computational photography Cheap cameras capture light, extensive processing produces

More information

Using Dynamic Views. Module Overview. Module Prerequisites. Module Objectives

Using Dynamic Views. Module Overview. Module Prerequisites. Module Objectives Using Dynamic Views Module Overview The term dynamic views refers to a method of composing drawings that is a new approach to managing projects. Dynamic views can help you to: automate sheet creation;

More information

Intelligent. Mobile Robots. Robots that know where they re going. Since 1995.

Intelligent. Mobile Robots. Robots that know where they re going. Since 1995. Intelligent Mobile Robots Robots that know where they re going. Since 1995. Robots & Controls for MobileRobots Inc offers OEMs, integrators and dealers robust, reliable robot controls and bases with our

More information

User interface for remote control robot

User interface for remote control robot User interface for remote control robot Gi-Oh Kim*, and Jae-Wook Jeon ** * Department of Electronic and Electric Engineering, SungKyunKwan University, Suwon, Korea (Tel : +8--0-737; E-mail: gurugio@ece.skku.ac.kr)

More information