Computer Vision Based Real-Time Stairs And Door Detection For Indoor Navigation Of Visually Impaired People

International Journal of Computational Engineering Research (IJCER), ISSN (e): 2250-3005, Volume 08, Issue 8, August 2018

Shrugal Varde 1, Dr. M. S. Panse 2
1 Research Scholar, Veermata Jijabai Technological Institute, Electrical Engineering Department, Mumbai
2 Professor, Veermata Jijabai Technological Institute, Electrical Engineering Department, Mumbai
Corresponding Author: Shrugal Varde

ABSTRACT: Doors and stairs are significant landmarks for indoor navigation. One existing approach to indoor navigation uses a topological map in combination with a device that helps a visually impaired person commute. A computer vision based navigation aid can improve the mobility of visually impaired persons in known or unknown, indoor or outdoor situations. In this work, we propose a computer vision based method to detect doors and stairs for indoor navigation. The images provided by stereo cameras are used to extract information about the segments that belong to stairs and doors. Several features, such as segment size and the distance between segments, are defined to help distinguish doors and stairs from similar structures found in indoor environments. The proposed method successfully detects doors and stairs under strong perspective deformation. Furthermore, our experiments show that it is suitable for real-time stairs and door detection.

KEYWORDS: Indoor navigation, visually impaired, computer vision, doors, stairs.

Date of Submission: 08-08-2018
Date of Acceptance: 23-08-2018

I. INTRODUCTION

In an unfamiliar environment, we spot and explore all the available information which might guide us to a desired location. Finding an office or a room in a building or shopping center is often a simple task for sighted people, but it is quite difficult for visually impaired people [16]. Vision impairment is often due to conditions such as diabetic retinopathy, macular degeneration, and glaucoma. The visually impaired (VI) community is very diverse in terms of degree of vision loss, age, and abilities. Vision loss affects almost every activity of daily living. Walking, driving, reading, and recognizing objects, people, and places become difficult or almost impossible without vision. Lack of mobility is a severe concern for blind people. They find it extremely difficult to travel independently as they cannot determine their position and orientation in the surrounding environment. They often have to rely on people passing by for information or, if the building is familiar, they need to memorize all the locations. The World Health Organization (WHO) carried out a survey in 2010 to estimate the total number of visually impaired people in the world [1]. According to this report, out of 6,737 million people in the world, 285 million are visually impaired. Of these 285 million, approximately 39 million are completely blind and 246 million have low vision. These figures clearly indicate the need for advanced assistive aids that can help the VI community. Human mobility comprises orientation and navigation [3]. Orientation can be thought of as knowledge of one's position with respect to the objects in the environment.
Information about position and route planning is linked with orientation. Navigation is the ability to move within the local environment; it involves information about stationary or moving obstacles in the surroundings, features of the floor, and so on. Two indoor structures which play a vital role in indoor navigation are stairs and doors. It is very difficult for blind people to detect the location of stairs or doors with just a white cane. Electronic travel aids (ETAs) have been designed for VI people which convert visual information into auditory or haptic feedback. There are many ETAs based on ultrasound and infrared technology for outdoor navigation which inform the user about probable obstacles in front of them [4, 5]. GPS is also used in a few outdoor navigation aids that help visually impaired people reach their destination, but this technology cannot be used for indoor navigation.

Due to narrow corridors and many structures, indoor navigation is more challenging than outdoor navigation. Many travel aid prototypes have been developed for indoor navigation [6]. Most of the indoor travel aids are based on RFID, Bluetooth, or wireless signal technology. Roshni [9], an indoor electronic travel aid developed at IIT Delhi, uses infrared wayfinding technology. The prototype downloads the floor plan of the building, locates and tracks the user inside the building, finds the shortest path, and provides step-by-step directions. Kinect based door detection [10] uses a single RGB camera and an infrared depth sensor to find whether doors are present in front of the user. A prototype antenna was also developed which is fitted in the door casing; its radiation pattern is shaped like a doughnut. The user carries a receiver which continuously monitors the energy from the antenna, and the prototype guides the person to move in the direction with maximum energy. GuideCane [2] uses 8 ultrasound sensors to find obstacles in front of the user. Two of these sensors are used to detect stairs; it can detect two upward stairs and two downward stairs. Another prototype uses RFID tags placed on each stair. The user has a device that reads the RFID tags and gives audio feedback informing the user whether the stairs go upward or downward. All the above technologies used for door and stair detection are different from the technologies used for outdoor navigation. The cost of installing and maintaining these devices is high. Moreover, the user has to carry two devices, one for outdoor and the other for indoor navigation.

The present work proposes an algorithm based on computer vision that can help detect doors and stairs for indoor navigation. The system captures images at a resolution of 1024x768. These images are given as input to the processing unit, which runs two separate modules for door detection and stair detection. Both modules extract information about the presence or absence of doors and stairs in the captured images and give auditory feedback to the user, informing them about the location of the doors and/or stairs.

The paper is organized as follows. Section 2 explains the concept of door detection using computer vision. Section 3 presents the algorithm for stair detection using computer vision. Section 4 presents and discusses the experiments and the results obtained. Section 5 concludes with final remarks.
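To make the processing flow concrete, the minimal Python sketch below mirrors the pipeline just described: frames are captured at 1024x768, passed to a door detection module and a stair detection module, and any detection is reported to the user. The detect_doors and detect_stairs placeholders, the use of OpenCV's VideoCapture, and the console output standing in for auditory feedback are illustrative assumptions, not the authors' implementation.

```python
# Minimal pipeline sketch (assumptions noted above): capture 1024x768 frames,
# run the two detection modules, and report the result to the user.
import cv2

def detect_doors(frame):
    """Placeholder for the door detection module described in Section II."""
    return []  # would return detected door locations

def detect_stairs(frame):
    """Placeholder for the stair detection module described in Section III."""
    return []  # would return detected stair locations

def main():
    cap = cv2.VideoCapture(0)                    # one camera of the stereo pair
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1024)      # resolution used by the prototype
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 768)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if detect_doors(frame):
            print("Door ahead")                  # auditory feedback would go here
        if detect_stairs(frame):
            print("Stairs ahead")
    cap.release()

if __name__ == "__main__":
    main()
```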
II. DOOR DETECTION USING COMPUTER VISION

The major characteristic of any door that distinguishes it from other obstacles and indoor structures is its edges. A door is made up of four edges, and the angle between two adjacent edges is always 90°. There are two types of doors, as shown in Figure 2.1.

Figure 2.1 Types of doors

They are called single frame doors and double frame doors respectively. The edges of a single frame door are the edges between the door and the wall. A double frame door contains two sets of edges: the outer edges lie between the door frame and the wall, and the inner edges lie between the frame and the door. Due to the perspective projection of the environment onto the 2D image plane, the angles between the edges do not always appear as 90°. Hence the angle feature alone cannot be used to decide whether a door is present close to the user. However, the property that the two vertical edges are always parallel to each other, with the horizontal edges lying between the two vertical edges, can be used to detect doors. To distinguish between doors and objects with similar properties, such as cupboards, we put a limit on the minimum length of the vertical edges and limits on the distance between the two detected horizontal edges. The algorithm makes the following assumptions:

1. The doors are not made of transparent material such as glass.
2. Both vertical edges of the door are visible.
3. At least one horizontal edge of the door is visible.
4. Doors in the image have a certain width and height.

Figure 2.2 Corners and edges of door

As shown in Figure 2.2, C1, C2, C3, C4 are the corners of the door and E12, E23, E34, E41 are its edges. Let D be the diagonal of the image. The ratio of the edge length to the diagonal distance D is given by equation (2.1), and the angle of an edge is given by equation (2.2):

r(Eij) = |Eij| / D        (2.1)

θ(Eij) = arctan((yj - yi) / (xj - xi))        (2.2)

where |Eij| is the length of the edge joining corners Ci and Cj, and (xi, yi) are the image coordinates of corner Ci. Vertical edges and horizontal edges of the door should have a certain height and width respectively, i.e.

h_min ≤ |E_vertical| ≤ h_max   and   w_min ≤ |E_horizontal| ≤ w_max        (2.3)

where the limits are determined experimentally (Section IV). The flowchart of the algorithm is given below.
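As one way the edge criteria above could be realized, the sketch below extracts line segments with a probabilistic Hough transform, keeps near-vertical and near-horizontal segments whose lengths fall inside the pixel limits reported in Section IV, and accepts a door when two vertical edges with a door-like separation have a horizontal edge between them. The Hough parameters, angle tolerances, and pairing heuristic are illustrative assumptions, not the authors' exact algorithm.

```python
# Sketch of the door criteria: long near-vertical edges, a shorter near-horizontal
# edge between them, with lengths bounded as in Section IV. Assumptions as noted above.
import cv2
import numpy as np

V_MIN, V_MAX = 520, 600   # allowed vertical-edge height in pixels (Section IV)
H_MIN, H_MAX = 250, 320   # allowed horizontal-edge width in pixels (Section IV)

def detect_door(image_bgr):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                            minLineLength=H_MIN, maxLineGap=10)
    if lines is None:
        return False
    verticals, horizontals = [], []
    for x1, y1, x2, y2 in lines[:, 0]:
        length = np.hypot(x2 - x1, y2 - y1)
        angle = abs(np.degrees(np.arctan2(y2 - y1, x2 - x1)))
        if 80 <= angle <= 100 and V_MIN <= length <= V_MAX:
            verticals.append((x1 + x2) / 2.0)               # x position of a vertical edge
        elif (angle <= 10 or angle >= 170) and H_MIN <= length <= H_MAX:
            horizontals.append((min(x1, x2), max(x1, x2)))  # x extent of a horizontal edge
    # Door candidate: two vertical edges separated by a door-like width, with at
    # least one horizontal edge lying between them (assumptions 2 and 3 above).
    for i, xa in enumerate(verticals):
        for xb in verticals[i + 1:]:
            left, right = sorted((xa, xb))
            if not (H_MIN <= right - left <= H_MAX):
                continue
            if any(left - 20 <= hx1 and hx2 <= right + 20 for hx1, hx2 in horizontals):
                return True
    return False
```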

III. STAIR DETECTION USING COMPUTER VISION

Stairs are among the most important structures in indoor navigation. Like doors, stairs can be distinguished from other obstacles by the unique property that all the edges of the stairs are parallel. As shown in Figure 3.1, stairs are characterized by one more feature: the parallel edges are bounded between two diagonal lines or by one diagonal edge.

Figure 3.1 Stairs with horizontal edges and diagonal edges

The features shown above are similar to those of other structures, such as book racks, found during indoor navigation. To distinguish between stairs and other similar structures, we put limits on the features extracted from the stereo images. Let n be the total number of horizontal lines in the given image, a_v the angle of the diagonal lines, a_h the angle of the horizontal lines, and d the maximum distance between two consecutive horizontal lines. For the algorithm to recognize stairs:

n_min ≤ n ≤ n_max   and   d_min ≤ d ≤ d_max        (3.1)

The values of n_max, n_min, d_max and d_min depend on the distance between the cameras and the stairs. The flowcharts of the door detection algorithm and the stair detection algorithm are shown below.
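A comparable sketch of the stair criterion in equation (3.1) is shown below: near-horizontal segments are grouped into candidate stair edges, and the count n and the largest gap d between consecutive edges are checked against the limits from Section IV. The diagonal-edge check mentioned above is omitted, and the Hough parameters and row-merging tolerance are illustrative assumptions rather than the authors' exact implementation.

```python
# Sketch of the stair criterion (3.1): count parallel horizontal edges and check
# their number and spacing against the Section IV limits. Assumptions noted above.
import cv2
import numpy as np

N_MIN, N_MAX = 7, 20      # allowed number of parallel horizontal lines (Section IV)
D_MIN, D_MAX = 10, 50     # allowed gap between consecutive lines, in pixels (Section IV)

def detect_stairs(image_bgr):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=60,
                            minLineLength=80, maxLineGap=10)
    if lines is None:
        return False
    # Keep near-horizontal segments; stair edges appear as stacked parallel lines.
    rows = []
    for x1, y1, x2, y2 in lines[:, 0]:
        angle = abs(np.degrees(np.arctan2(y2 - y1, x2 - x1)))
        if angle <= 10 or angle >= 170:
            rows.append((y1 + y2) / 2.0)
    rows.sort()
    # Merge segments lying on almost the same image row into one stair edge.
    merged = []
    for y in rows:
        if not merged or y - merged[-1] > 5:
            merged.append(y)
    n = len(merged)
    if not (N_MIN <= n <= N_MAX):
        return False
    d = max(b - a for a, b in zip(merged, merged[1:]))   # largest consecutive gap
    return D_MIN <= d <= D_MAX
```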

IV. EXPERIMENTS AND RESULTS

The experimentation for the door and stair detection algorithms was performed in two stages. Stage 1 involved studying a number of images containing doors and stairs, together with images of objects whose structure is similar to doors and stairs. This stage was used to decide the limiting values required to classify doors and non-doors, and stairs and non-stairs. For this stage, our hardware prototype captured 60 images: 20 contained doors, 20 contained stairs, 10 contained cupboards (which have a structure similar to doors), and 10 contained book racks (whose structure is similar to stairs). For all these images, the distance between the objects and the camera system was at least 3 meters. At this distance, it was observed that for a vertical edge to qualify as a door edge, its height should be between 520 and 600 pixels, and for a horizontal edge to qualify as a door edge, its length should be between 250 and 320 pixels. For stair detection, the distance between two parallel lines should be between 10 and 50 pixels, and the number of parallel lines should be between 7 and 20.

Stage 2 consisted of the actual validation experiments. The validation set contained 230 images: 90 images, a mixture of images with and without doors, were used to validate the door detection module, and 140 images were used to validate the stair detection module. The results obtained are given in the following table.

Images used for validation     Identified as door or stairs    Identified as bookshelf or cupboard    Percentage accuracy of the module
50 images with a door                       38                              12                               76.00%
40 images without a door                     3                              37                               92.50%
70 images with stairs                       49                              21                               70.00%
70 images without stairs                     5                              65                               92.80%

Figures 4.1 and 4.2 show examples of detected doors and stairs.
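The percentages in the table follow directly from the counts: each value is the number of correctly classified images divided by the number of images in that row, for example 38/50 = 76.00% for the images containing a door. A quick check:

```python
# Recompute the table's accuracy figures from the raw counts (correct / total).
rows = {
    "50 images with a door":    (38, 50),  # correctly identified as door
    "40 images without a door": (37, 40),  # correctly identified as cupboard/bookshelf
    "70 images with stairs":    (49, 70),  # correctly identified as stairs
    "70 images without stairs": (65, 70),  # correctly identified as bookshelf
}
for name, (correct, total) in rows.items():
    print(f"{name}: {100.0 * correct / total:.2f}%")
# Output: 76.00%, 92.50%, 70.00%, 92.86% (reported as 92.80% in the table)
```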

V. CONCLUSION AND FUTURE SCOPE

In this paper, we have presented door and stair detection modules that can provide information about the location of stairs and doors to a visually impaired person. The time required for the door detection and stair detection modules to process an image, extract information about doors and stairs, and give auditory feedback to the user about their presence or absence is 0.15 seconds and 0.2 seconds respectively. Hence the software can work at a frame rate of 5 frames per second. The accuracy of the door detection module is approximately 75% and that of the stair detection module is 70%. There is a vast scope for improvement in the design. In the near future, we would like to explore the use of a support vector machine to better classify doors and stairs. The depth information generated by the stereo cameras used for door and stair detection can also help to further classify stairs as upward or downward.

REFERENCES
[1]. Global Data on Visual Impairment. World Health Organization, 2010.
[2]. L. Dunai, I. Lengua Lengua, I. Tortajada, and F. Brusola Simon, "Obstacle detectors for visually impaired people," in International Conference on Optimization of Electrical and Electronic Equipment (OPTIM), pp. 809-816, May 2014.
[3]. R. Jafri, S. A. Ali, H. R. Arabnia, and S. Fatima, "Computer vision-based object recognition for the visually impaired in an indoors environment: a survey," The Visual Computer, vol. 30, no. 11, pp. 1197-1222, 2014.
[4]. X. Lu and R. Manduchi, "Detection and localization of curbs and stairways using stereo vision," in Proceedings of the 2005 IEEE International Conference on Robotics and Automation, pp. 4648-4654, April 2005.
[5]. S. Wang and H. Wang, "2D staircase detection using real AdaBoost," in 2009 7th International Conference on Information, Communications and Signal Processing (ICICS), pp. 1-5, Dec 2009.
[6]. Y. H. Lee, T. S. Leung, and G. Medioni, "Real-time staircase detection from a wearable stereo system," in Proceedings of the 21st International Conference on Pattern Recognition (ICPR 2012), pp. 3770-3773, Nov 2012.
[7]. "RGB-D image-based detection of stairs, pedestrian crosswalks and traffic signs," Journal of Visual Communication and Image Representation, vol. 25, no. 2, pp. 263-272, 2014.
[8]. "Stairs detection with odometry-aided traversal from a wearable RGB-D camera," Computer Vision and Image Understanding, vol. 154, pp. 192-205, 2017.
[9]. D. Jain, "Path-guided indoor navigation for the visually impaired using minimal building retrofitting," in Proceedings of the 16th International ACM SIGACCESS Conference on Computers and Accessibility, pp. 225-232, 2014.
[10]. Y. Zhou, G. Jiang, G. Xu, X. Wu, and L. Krundel, "Kinect depth image based door detection for autonomous indoor navigation," in The 23rd IEEE International Symposium on Robot and Human Interactive Communication, Scotland, UK, pp. 147-152, 2014.