ASSISTIVE TECHNOLOGY BASED NAVIGATION AID FOR THE VISUALLY IMPAIRED

OMER HAMEED, BILAL NASEEM, JAVAID IQBAL, MUHAMMAD AHMAD, OSMAN ANWAR, SOHAIB AFZAL
Department of Mechatronics Engineering, College of Electrical and Mechanical Engineering,
National University of Sciences and Technology (NUST), Peshawar Road, Rawalpindi
ohameed@yahoo.com, bilalnaseem@hotmail.com, jiqbal-eme@nust.edu.pk, m-ahmad28@yahoo.com, osmananwar@hotmail.com, sohaib-afzal@hotmail.com

Proceedings of the 7th WSEAS International Conference on Robotics, Control & Manufacturing Technology, Hangzhou, China, April 15-17, 2007

Abstract: This paper discusses the conception and development of a semi-autonomous robot that aids visually impaired users in travelling by helping them make purely intuitive decisions, i.e., the user makes the final decision. The core of the robot comprises a steerable base and a sensor suite mounted on it. We fused multiple sensors to achieve better results for obstacle avoidance and path planning, and developed interlocks that provided quite accurate results.

Keywords: Assistive Technology, Navigation Aid, Visually Impaired, Multi-Sensor Fusion.

1. INTRODUCTION

Over the past three decades, considerable research effort has gone into navigation for the visually impaired. The C-5 Laser Cane was built by Benjamin et al. [1]; the device uses optical triangulation with three laser diodes and three photodiodes as receivers. The Nottingham Obstacle Detector (NOD), designed by Bissit and Heyes [2], is a handheld sonar device that gives auditory feedback, categorized as a weak or a strong response. These devices were not very successful because they required the user to actively scan the environment, a mode of human-machine interaction that is very time-consuming, and because interpreting audio signals was an added task for an already burdened user.

Borenstein et al. built the NavBelt [3], a device worn around the waist and equipped with an onboard computer. It used ultrasonic sensors for obstacle avoidance and translated the device's 120° field of view into audio directions. Ulrich et al. developed the GuideCane [4], a device with a long handle attached to a steerable base carrying an array of ultrasonic sensors. More recently, radio frequency identification (RFID) based assistive systems are being developed, but their application remains limited primarily to indoor environments [5].

A problem prevalent among assisted navigation systems is their infrequent deployment, which makes it difficult to compare results and decide which system is best suited. In some cases the user must wear additional body gear, which causes physical fatigue. Assistive Technology (AT) cannot help all visually impaired individuals, but it is useful for most of them, and it retains advantages over the white cane and the guide dog: guide dogs must be trained, white canes take time to get used to, and both require a substantial investment of time.

Whether intuitive and auditory responses contribute to cognitive load is debatable. Most users would like robots to assist them, but they want to make the final path planning decision themselves. This machine is purpose-built on Assistive Technology (AT) with the aim of helping a visually impaired user navigate an unfamiliar environment. The hardware of the prototype is described in Section 2 and its theory of operation in Section 3. Section 4 presents the fusion of sensors for obstacle detection and explains the intuitive response. Section 5 describes the software and control approach. Findings of the pilot experiments are discussed in Section 6.

2. HARDWARE

In this section we discuss the hardware of the navigation system, ATNAVI (Assistive Technology based Navigation Aid for the Visually Impaired), depicted in Figure 1. It consists of a base steered by a DC motor. A semicircular sensor suite carrying an array of five ultrasonic sensors is mounted on the base, and three infrared range finders sit on the base itself. The system rides on two wheels and three castors for stability; it never becomes unbalanced, so the sensors return reasonably accurate readings. A handle attached to the base carries a miniature control pad used to change the direction of motion. The system also carries a single board computer, a fluxgate magnetic compass, and optical encoders; encoders are fitted on both wheels and on the shaft of the motor that rotates the steerable base. Separate compartments house the regulator circuitry, the embedded microcontroller, the motor drive circuits, and the batteries. Most of the mechanical structure is acrylic, which is lightweight and durable. Reinforced fibreglass was used instead of metal to reduce weight, with aluminium wherever fibreglass failed to deliver.

[Figure 1: The ATNAVI system]

3. THEORY OF OPERATION

The working principle of the system is simple. The user pushes the system and it moves forward. When the sensors detect an obstacle, the system steers the user around it: the system gives a series of electronic indications, and the user responds to them intuitively.

Initially, the user prescribes a direction of motion with the joy pad on the handle. It has five buttons in a single row; from left to right they are west, northwest, north, northeast, and east. The fluxgate compass determines the initial direction. For example, if the compass is pointing north and the user presses northeast, the computer adds 45° and rotates the base until the compass faces northeast. The user can then push the device in that direction, and the system follows the same path until it comes across an obstacle.
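Since the paper does not publish its control code, the heading-selection step can be illustrated with a short C++ sketch. Everything here is an assumption made for illustration: the helper names (readCompassDeg, rotateBaseStep, setHeading) are hypothetical, and the compass and base motor are simulated.

```cpp
#include <cmath>
#include <cstdio>

// Simulated hardware, standing in for the fluxgate compass and the base
// motor; on the real device these would be driver calls.
static double g_headingDeg = 0.0;                  // 0 = north (simulated)
double readCompassDeg() { return g_headingDeg; }
void rotateBaseStep(int dir) {                     // one 0.5-degree step
    g_headingDeg = std::fmod(g_headingDeg + dir * 0.5 + 360.0, 360.0);
}

enum Button { WEST, NORTHWEST, NORTH, NORTHEAST, EAST };

// Offset each joy-pad button adds to the current heading (the paper's
// example: compass points north, user presses northeast, add 45 degrees).
static int buttonOffsetDeg(Button b) {
    switch (b) {
        case WEST:      return -90;
        case NORTHWEST: return -45;
        case NORTH:     return   0;
        case NORTHEAST: return  45;
        case EAST:      return  90;
    }
    return 0;
}

// Rotate the steerable base until the compass reads the requested heading.
void setHeading(Button pressed) {
    const double target =
        std::fmod(readCompassDeg() + buttonOffsetDeg(pressed) + 360.0, 360.0);
    const double tolDeg = 1.0;  // assumed stopping band
    for (;;) {
        // Signed shortest angular error, in (-180, 180].
        double err = std::fmod(target - readCompassDeg() + 540.0, 360.0) - 180.0;
        if (std::fabs(err) <= tolDeg) break;
        rotateBaseStep(err > 0 ? +1 : -1);
    }
}

int main() {
    setHeading(NORTHEAST);
    std::printf("base now faces %.1f deg\n", readCompassDeg());  // close to 45
}
```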

4. FUSION OF SENSORS FOR OBSTACLE DETECTION

We use a combination of ultrasonic and infrared sensors for obstacle detection. Ultrasonic sensors are the most common choice for this task, but the large field of view of a sonar beam increases the uncertainty about the actual location of an obstacle. To overcome this shortcoming, infrared sensors are used alongside the array of sonars.

4.1 Ultrasonic Sensors

Ultrasonic sensors detect the obstacles. Their major disadvantage is that their response is orientation dependent. There are two remedies: place the sensors close enough that their sonar regions overlap, or use a different sensor type to confirm the ultrasonic reading. We took both measures to acquire precise information from the ultrasonic sensors. The prototype has five ultrasonic sensors mounted along a semicircular contour at suitable angles, and three infrared sensors mounted in a row on the static base, 120 mm apart. Figure 2 shows the five ultrasonic sensors and the two outermost of the three infrared sensors.

[Figure 2: Sensor arrangement]

The detection range of the ultrasonic sensors can be adjusted. Sensor 1 is used at its maximum range (2000 mm), sensors 2 and 3 at medium range (1300 mm), and sensors 4 and 5 at short range (600 mm). When the system encounters an obstacle, sensor 1 signals the computer that an obstacle is present. The system then determines whether the obstacle lies in the overlapping region of sensors 1 and 2 or of sensors 1 and 3; this tells the computer from which side the system should avoid the obstacle. If the obstacle is in the overlapping region of sensors 1 and 2, the system avoids it from the right. Once the obstacle is no longer present in the overlapping region of sensors 3 and 5 or of sensors 2 and 4, the system knows it has been avoided.

4.2 Infrared Sensors

The infrared sensors confirm the presence of an obstacle in the user's path. They return a pulse only when an obstacle is present, so the absence of a pulse means the obstacle has been avoided. They also indicate how big the object is: if any two adjacent infrared sensors return pulses, the obstacle is at least 120 mm wide, and so on. Together, the ultrasonic and infrared sensors serve as eyes for the visually impaired user.
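As a rough illustration of this decision rule, the sketch below encodes the overlap test of Section 4.1 and the infrared width estimate of Section 4.2. The paper states the rule in prose only; the distance tolerance, the helper names, and the example values in main() are assumptions.

```cpp
#include <array>
#include <cstdio>
#include <cstdlib>

using SonarReadings = std::array<int, 5>;  // echo distances in mm, 0 = no echo
using IrReadings    = std::array<bool, 3>; // pulse flags, left to right

enum class Avoid { None, Left, Right };

// Two sonars "agree" when they report nearly the same distance, which places
// the obstacle inside their overlapping beam region (tolerance assumed).
static bool inOverlap(int a, int b, int tolMm = 100) {
    return a > 0 && b > 0 && std::abs(a - b) <= tolMm;
}

// The paper's rule: obstacle in the overlap of sensors 1 and 2 -> avoid from
// the right; in the overlap of sensors 1 and 3 -> from the left. Indices are
// 0-based here, so sensor 1 is s[0].
Avoid chooseSide(const SonarReadings& s) {
    if (s[0] == 0) return Avoid::None;     // long-range sensor sees nothing
    if (inOverlap(s[0], s[1])) return Avoid::Right;
    if (inOverlap(s[0], s[2])) return Avoid::Left;
    return Avoid::None;
}

// Minimum obstacle width implied by the IR row (sensors 120 mm apart):
// two adjacent pulses -> at least 120 mm, all three -> at least 240 mm.
int minObstacleWidthMm(const IrReadings& ir) {
    if (ir[0] && ir[1] && ir[2]) return 240;
    if ((ir[0] && ir[1]) || (ir[1] && ir[2])) return 120;
    return 0;
}

int main() {
    SonarReadings s{1200, 1180, 0, 0, 0};  // obstacle in overlap of 1 and 2
    IrReadings ir{true, true, false};
    std::printf("side=%d, width>=%d mm\n",
                static_cast<int>(chooseSide(s)), minObstacleWidthMm(ir));
}
```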

4.3 Intuitive Response

When an obstacle is detected, the base turns to avoid it. The user feels the base turning through the handle as a direct physical force caused by the change in direction of the base and handle, and intuitively turns with the system. If the system reaches a point where it cannot decide which path to take, it comes to a stop and tells the user to stop. After avoiding the obstacle, the system resumes the original direction of travel, but at an offset.

5. SOFTWARE

The software is divided into two main parts. The single board computer runs the obstacle avoidance algorithm, developed in C++, while the software on the microcontroller serves as an interface between the single board computer and the sensors and encoders. The two communicate over an asynchronous serial link. The program flow is shown in Figure 3.

[Figure 3: Program flow]

The single board computer sends the addresses of the ultrasonic sensors one by one to the microcontroller, which fires each sensor in turn and returns the recorded distance; the distances are saved in an array. The infrared readings are likewise transmitted to the single board computer. The combined ultrasonic and infrared data are then used to find a candidate direction of travel with the obstacle avoidance algorithm. Once the direction vector is found, the single board computer sends the required rotation in degrees to the microcontroller, which rotates the steerable base while checking the counts from the encoder on the motor shaft to make sure the desired rotation is achieved. Once obstacle avoidance is complete, the user is brought back to the original line of travel.

The obstacle avoidance algorithm is based on the overlapping regions created by the beam widths of the five sonar sensors. The sensors are placed on the suite, according to calculation, so that overlapping regions form between them up to a certain beam width. If an obstacle lies within such an overlapping region, two or more sensors report the same obstacle distance. Using these regions, the algorithm can predict the position of an obstacle in front of the user and avoid it from an appropriate side; the sensors are placed so that any obstacle in the user's path always falls within one of the overlapping regions. The infrared sensors help determine the probable positions of large obstacles.

After an obstacle has been avoided, the user is first brought parallel to the original direction of motion. To return the user to the original line of travel, the counts recorded by the right and left wheel encoders during obstacle avoidance are used: once avoidance is complete, the steerable base is rotated opposite to its direction during avoidance, and the right and left wheel counts are compared with the recorded values. When the counts equal the recorded values, the base is brought back to the position it held before obstacle avoidance.
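A loose sketch of this return-to-path step follows. The paper gives the logic in prose only, so the absolute-angle abstraction for the steerable base and every helper name here are assumptions, and the hardware hooks are left as declarations rather than real drivers.

```cpp
#include <cstdint>

int32_t leftWheelCount();       // hypothetical wheel encoder counters
int32_t rightWheelCount();
void    setBaseAngleDeg(int d); // hypothetical: base angle relative to the
                                // original line of travel (0 = straight)
bool    userStillPushing();     // the device only moves while pushed

struct AvoidanceRecord {
    int32_t left0, right0;  // wheel counts when avoidance began
    int     steerDeg;       // angle commanded to skirt the obstacle
};

AvoidanceRecord beginAvoidance(int steerDeg) {
    setBaseAngleDeg(steerDeg);
    return { leftWheelCount(), rightWheelCount(), steerDeg };
}

// Called once the sonars report the obstacle cleared: steer the mirror
// angle, let the user roll through the same encoder counts recorded on the
// way out, then straighten. Matching the counts cancels the lateral offset
// and leaves the user on the original line of travel.
void returnToLine(const AvoidanceRecord& rec) {
    const int32_t dLeft  = leftWheelCount()  - rec.left0;
    const int32_t dRight = rightWheelCount() - rec.right0;

    setBaseAngleDeg(-rec.steerDeg);
    const int32_t targetLeft  = leftWheelCount()  + dLeft;
    const int32_t targetRight = rightWheelCount() + dRight;

    while (userStillPushing() &&
           (leftWheelCount() < targetLeft || rightWheelCount() < targetRight)) {
        // wait: the user supplies the motion, we only watch the counters
    }
    setBaseAngleDeg(0);  // back on the original heading
}
```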

6. RESULTS OF PILOT EXPERIMENTS

The prototype was tested throughout its development phase. Five people tested it, all blindfolded before use; all were physically fit and had no hearing problems. A series of experiments yielded the following findings:

- The system detects obstacles down to 50 mm in height.
- The user adapts quickly to the system.
- The system works better on an even surface.
- The system is being modified to detect corridors and dead ends such as walls; increasing the number of sensors would help do that.
- The combination of infrared and sonar sensors performs better than sonar sensors alone.
- The sensor response is quick enough for the user to walk at normal speed.

[Figure 4: Results achieved]
[Figure 5: Maze arrangement]

The graph in Figure 4 is drawn from a sample of one hundred trials, ten trials for each fixed distance between obstacles. For every ten trials, a maze of twenty obstacles was built with a fixed distance between adjacent obstacles; one such maze arrangement is shown in Figure 5.

7. FUTURE WORK

Various dead reckoning techniques are being tested to ensure that the system returns to its original line of travel. The system is being modified to work on different types of terrain and on uneven surfaces. The team is running further tests on the software. In addition, global positioning techniques are being studied to improve the prototype.
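For context on the dead reckoning mentioned above, a standard differential-drive odometry update from the two wheel encoders is sketched here. The wheel radius, encoder resolution, and track width are assumed values, not taken from the paper.

```cpp
#include <cmath>
#include <cstdio>

struct Pose { double x, y, theta; };  // metres, metres, radians

// Assumed drive parameters, for illustration only.
constexpr double PI            = 3.14159265358979323846;
constexpr double WHEEL_RADIUS  = 0.05;   // m
constexpr double TICKS_PER_REV = 500.0;  // encoder resolution
constexpr double TRACK_WIDTH   = 0.30;   // m, wheel separation
constexpr double M_PER_TICK    = 2.0 * PI * WHEEL_RADIUS / TICKS_PER_REV;

// Advance the pose estimate from encoder tick deltas since the last update
// (standard differential-drive odometry, midpoint heading approximation).
Pose deadReckon(Pose p, long dTicksLeft, long dTicksRight) {
    const double dL = dTicksLeft  * M_PER_TICK;
    const double dR = dTicksRight * M_PER_TICK;
    const double dC = 0.5 * (dL + dR);               // centre-point travel
    const double dTheta = (dR - dL) / TRACK_WIDTH;   // heading change

    p.x     += dC * std::cos(p.theta + 0.5 * dTheta);
    p.y     += dC * std::sin(p.theta + 0.5 * dTheta);
    p.theta += dTheta;
    return p;
}

int main() {
    Pose p{0.0, 0.0, 0.0};
    p = deadReckon(p, 500, 500);   // one revolution per wheel: straight line
    std::printf("x=%.3f m, y=%.3f m, theta=%.3f rad\n", p.x, p.y, p.theta);
}
```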

8. CONCLUSION

The prototype was realized with extensive attention to the design, simulation, and implementation of the problem, validated through a series of pilot experiments. The advantages of our prototype over other navigation guides are: the ultrasonic sensors scan the environment for obstacles on the user's behalf; the information is transferred to the user through a very strong and direct indicator, the physical force felt through the handle; and the presence of obstacles reported by the ultrasonic sensors is confirmed by the infrared sensors. Moreover, the system merely assists the user in navigation by indicating the presence of obstacles and barriers; the final decision to travel rests with the user. As a consequence, the system is easy to use and readily adaptable.

ACKNOWLEDGEMENTS

This undergraduate research was funded by Sight Savers International (SSI), The Royal Commonwealth Society for the Blind (www.sightsavers.org).

REFERENCES

[1] Benjamin, J. M.; Ali, N. A.; and Schepis, A. F. 1973. A Laser Cane for the Blind. In San Diego Biomedical Symposium.
[2] Bissit, D., and Heyes, A. 1980. An Application of Biofeedback in the Rehabilitation of the Blind. Applied Ergonomics 11(1):31-33.
[3] Shoval, S.; Borenstein, J.; and Koren, Y. 1994. Mobile Robot Obstacle Avoidance in a Computerized Travel Aid for the Blind. In IEEE International Conference on Robotics and Automation.
[4] Borenstein, J., and Ulrich, I. 1994. The GuideCane: A Computerized Travel Guide for the Active Guidance of Blind Pedestrians. In IEEE International Conference on Robotics and Automation.
[5] Kulyukin, V., and Blair, M. 2003. Distributed Tracking and Guidance in Indoor Environments. In Conference of the Rehabilitation Engineering and Assistive Technology Society of North America (RESNA-2003).
[6] Burgard, W.; Cremers, A.; Fox, D.; Hahnel, D.; Lakemeyer, G.; Schulz, D.; Steiner, W.; and Thrun, S. 1999. Experiences with an Interactive Museum Tour-Guide Robot. Artificial Intelligence 114:3-55.
[7] Addlesee, M.; Curwen, R.; Hodges, S.; Newman, J.; Steggles, P.; and Ward, A. 2001. Implementing a Sentient Computing System. IEEE Computer (August):2-8.
[8] Kulyukin, V. 2003. Towards Hands-Free Human-Robot Interaction through Spoken Dialog. In AAAI Spring Symposium on Human Interaction with Autonomous Systems in Complex Environments.
[9] Kulyukin, V.; Gharpure, C.; and De Graw, N. 2004. A Robotic Guide for the Visually Impaired. In AAAI Spring Symposium on Interaction between Humans and Autonomous Systems over Extended Operation.