ON STAGE PERFORMER TRACKING SYSTEM


Salman Afghani, M. Khalid Riaz, Yasir Raza, Zahid Bashir and Usman Khokhar
Deptt. of Research and Development Electronics Engg., APCOMS, Rawalpindi, Pakistan
Vol. 3, Issue 2, pp. 65-76

ABSTRACT

Whether in show business, news reporting, or concert and sports telecasting, it is the cameraman who plays the key role. Moreover, the focus of today's electronics industry is to produce human-centric technologies: technologies that take over monotonous, routine, everyday tasks so that a person can stay human rather than become a robot; technologies that keep us from straining our nerves and reduce the skill required, since a stressful life leads to merely logical, opportunistic behaviour, whereas stress-relieving, skill-free, human-friendly technologies can lead us towards wisdom, humanity and peaceful living. This project aims to develop an automatic system that tracks a performer on stage and captures the video. The system is based on IR tracking: the camera's position is adjusted to follow the performer by a servo motor that responds to IR signals detected from an IR beacon attached to the performer, and the camera then captures the performance automatically. The techniques used for IR tracking are the polarization of the IR receivers and their placement. The same system can be deployed from several directions simultaneously to observe all the movements of the performer without any human involvement. As the system is fully automatic, staffing expenses decrease quite remarkably.

KEYWORDS: performer tracking, IR tracking, neuro system, IR detection

I. INTRODUCTION

Automatic target tracking is used in many systems, with targets tracked using different technologies. In some cases a laser designator provides targeting for laser-guided bombs, missiles or precision artillery munitions: the target is designated by a laser beam and then attacked automatically.
Many other systems also rely on target tracking; here we apply the idea to a peaceful purpose. The On Stage Performer Tracking System is a means of capturing the moves of a performer, or target, automatically without human involvement. It removes the duty of the cameraman, who otherwise has to stay focused on the performer at all times, which can be very difficult for a human being. In tracking, time and reliability are two crucial parameters: if a tracking system is not real-time, it is very difficult to determine the exact location of an object. Our system is capable of tracking the performer in almost real time. The system is also reliable, because no complex circuitry is involved and the equipment used to implement it is readily available and low-cost, so maintenance poses no difficulty. Performer tracking has long been researched and many systems have been built for the purpose. Some are based on multiple sensors spread across the stage while the performer carries a transmitter [1]. The position of the performer is tracked by comparing the received signal strength at each sensor, and the camera is pointed towards the sensor receiving the strongest signal; comparing signal strengths and then repositioning the camera results in delayed tracking of the performer. In another system for tracking the movements of one performer upon a stage, a number of signal transmitters placed on the stage emit electromagnetic signals [2]. Performers on the stage are equipped with transponders which receive the electromagnetic signals and generate coded

sound signals. A controller determines the positions of the transponders and hence tracks the positions of the performers carrying them. All the previous systems use complex methodologies and techniques to detect and track the position of the performer, and they are also very costly. The sensors and transmitters used in many of them are specially designed and not generally available, which increases maintenance and repair costs. This research project has been sponsored and supported by APCOMS, Khadim Hussain Road, Rawalpindi.

II. MOTORIZED GIMBAL

2.1 Gimbal Introduction
The main objective of our project was to track the performer and capture video through a camera mounted on a two-degree-of-freedom (2 DOF) servo gimbal. Two degrees of freedom are needed to track a moving performer, and these are achieved using two servo motors [3]: one for pitch control and the other for yaw control, as shown in Figure 1.

Figure 1 Motorized Gimbal

PCB sheet is used for the mechanical design because of the following advantages:
--Light weight
--Strength
--Economical

2.2 Gimbal Geometry
In two dimensions every rotation matrix is of the form [4]:

    R(θ) = | cos θ   -sin θ |                          (1)
           | sin θ    cos θ |

where θ is the angle of rotation. Applying it to a point,

    | x' |   | cos θ   -sin θ | | x |                  (2)
    | y' | = | sin θ    cos θ | | y |

where (x, y) is the initial position and (x', y') the position after rotation. So the coordinates (x', y') of the point (x, y) after rotation are:

    x' = x cos θ - y sin θ                             (3)
    y' = x sin θ + y cos θ

This is shown in Figure 2.
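The 2-D rotation of equations (1)-(3) can be sketched numerically. This is an illustrative snippet, not code from the paper; the function name `rotate_2d` is our own.

```python
import math

def rotate_2d(x, y, theta):
    """Rotate the point (x, y) about the origin by theta radians,
    per equation (3): x' = x cos(theta) - y sin(theta),
                      y' = x sin(theta) + y cos(theta)."""
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

# Rotating (1, 0) by 90 degrees lands on (0, 1), as expected.
xp, yp = rotate_2d(1.0, 0.0, math.pi / 2)
```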

Figure 2 Rotation with angle θ

In a gimbal one axis is fixed and the others rotate. The following three basic (gimbal-like) rotation matrices rotate vectors about the x, y and z axes in three dimensions:

    R_x(θ) = | 1      0        0    |                  (4)
             | 0    cos θ   -sin θ  |
             | 0    sin θ    cos θ  |

    R_y(θ) = |  cos θ   0   sin θ  |                   (5)
             |    0     1     0    |
             | -sin θ   0   cos θ  |

    R_z(θ) = | cos θ   -sin θ   0 |                    (6)
             | sin θ    cos θ   0 |
             |   0        0     1 |

In equations (4), (5) and (6) one axis is fixed and the other two are rotated. The example of fixing the y-axis and rotating the other two axes is shown in Figure 3. As we are using a 2 DOF gimbal which rotates about the x-axis and y-axis, in our case the z-axis is fixed.

Figure 3 Rotation around y-axis

III. SENSING AND POLARIZATION

3.1 Introduction
Sensing and polarization are essential parts of our project, and we have done considerable research on both. In our case, sensing refers to detecting the position of the performer, and polarization refers to setting the field of view of the IR sensors.

To explain the sensing process, we first describe the method of polarizing the sensors and its benefits.

3.2 Polarization
IR sensors (receivers) generally have a very wide field of view, but they require line of sight, as IR cannot penetrate obstacles [5]. An IR sensor senses IR arriving from any frontal direction. In our case we wanted each sensor to detect IR only from a specific region and reject the rest of what it would normally sense; for that purpose we polarized our sensors. For polarization [6] we used two different techniques:
--Polarization using aluminium tubes
--Polarization using aluminium hoods (caps)

3.3 Polarization using Aluminium Tubes
For polarizing the sensors we first used aluminium tubes covering the photodiode of the IR sensor. The benefit is that the receiver's photodiode detects IR only through the tube, and because the tube's path is narrow, the receiver's field of view is reduced. The problem was that, since IR is detected along the line of sight, the field of view shrank in both the horizontal and vertical axes, whereas we wanted it reduced only in the horizontal plane. Another disadvantage was that the field of view was not precise: we wanted some sensors to detect over a wide area and some over a very narrow one, which could not be achieved with the tubes. The receiver looks as shown in Figure 4. Because of these disadvantages we opted for another technique that solves these problems.

Figure 4 Polarized sensor using an aluminium tube

3.4 Polarization using Aluminium Hood (cap)
As the results of polarization with tubes were not satisfactory, we used aluminium caps. The caps are designed to cover the IR sensor IC from all sides, with an opening on the photodiode side through which the sensor detects IR.
The shape was chosen to slant from front to back so that the IR sensor detects IR over a large area in the vertical direction. Since we wanted different fields of view in the horizontal axis, we achieved them by changing the width of the opening. This relationship is shown in the graph in Figure 5.

Figure 5 Field of view versus opening of the hood

The aluminium hood used is similar to that shown in Figure 6.
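The trend in Figure 5 can be illustrated with a simple pinhole-style model: treat the photodiode as sitting some depth behind the opening, so the horizontal field of view is set by the opening width. The function, its parameters and the geometry are our own illustrative assumptions, not measurements from the paper.

```python
import math

def horizontal_fov_deg(opening_width_mm, hood_depth_mm):
    """Approximate horizontal field of view of a hooded sensor, assuming
    the photodiode sits hood_depth_mm behind an opening that is
    opening_width_mm wide: FOV = 2 * atan(w / (2 * d))."""
    return math.degrees(2 * math.atan(opening_width_mm / (2 * hood_depth_mm)))

# A wider opening at the same depth yields a wider field of view,
# consistent with the trend of Figure 5.
narrow = horizontal_fov_deg(2, 10)   # small opening -> precision sensor
wide = horizontal_fov_deg(10, 10)    # large opening -> wide-range sensor
```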

Figure 6 Aluminium hood used to polarize an IR sensor

In this project we required sensors with different fields of view: some wide-range, and some so precise that they sense IR only from a region of roughly 25-30 cm.

3.5 Sensing
Sensing refers to detecting the position of the performer on the stage. The sensors are mounted on a board attached to the servo, so rotating the servo also rotates the sensors and thus changes their area of view. We used two different algorithms for detecting the performer on the stage:
--Single Eye Detection
--Three Eyes Detection

3.6 Single Eye Detection
In this algorithm only one sensor is used to detect the position of the performer on the stage. The sensor is polarized so that it detects IR only from a region of about one foot. Initially the servo rotates so that the sensor sweeps the stage in search of IR; when the sensor receives IR, the servo stops at that position and stays still for as long as IR is received. Whenever the sensor stops receiving IR, the servo starts rotating again to search for the performer. This was the basic tracking algorithm, but it was slow and not particularly accurate. The flow diagram of single eye detection is shown in Figure 7.

Figure 7 Single eye detection flow diagram
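One iteration of the single-eye algorithm can be sketched as a tiny state update: hold position while the lone sensor sees IR, otherwise keep sweeping. The function name, the step size and the 180-degree sweep range are illustrative assumptions; the original runs on a microcontroller.

```python
def single_eye_step(ir_detected, servo_angle, step=1):
    """One iteration of single-eye detection: hold the servo while the
    sensor receives IR, otherwise advance it to continue the sweep.
    Angles are in degrees over an assumed 0-179 degree sweep."""
    if ir_detected:
        return servo_angle               # beacon in view: hold position
    return (servo_angle + step) % 180    # no beacon: keep scanning

angle = 90
angle = single_eye_step(False, angle)  # no IR seen, so the servo advances
held = single_eye_step(True, angle)    # IR seen, so the servo holds
```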

3.7 Three Eyes Detection
In this algorithm three sensors are used to detect the position of the performer on the stage accurately. Two sensors are polarized to detect IR over a wide range, while one is polarized to detect IR only from a very narrow range. They are arranged with the narrow-range sensor in the centre and the two wide-range sensors on its left and right, as shown in Figure 8.

Figure 8 Three eyes detection sensor arrangement

The wide-range sensors detect whether the performer is on the right or left side, and the narrow-range (precision) sensor detects the exact position of the performer. The current working of the system is based on this detection algorithm.

IV. WORKING

The system is designed so that when no IR beacon is detected, the servo motors, and hence the sensors attached to them, rotate continuously in one direction and then the other. This continues until IR is detected and is known as scanning mode. Of the three sensors, one is for precision and the other two indicate the direction in which the performer currently is. If the performer is detected on the right-side sensor the servo is driven anticlockwise, and if the

performer is detected on the left-side sensor the servo is driven clockwise. This rotation continues until the centre sensor detects the IR, at which point the servo stops at that position. If the performer then moves in either direction, the movement is sensed and tracked using the wide-range sensors, and whenever the performer stops, the servo stops. The camera is attached to a plate connected to the servo head through supports above the sensors. The flow diagram of the system is shown in Figure 9.

Figure 9 Flow diagram of the system

V. ARTIFICIAL NEURAL NETWORK OF ON STAGE PERFORMER TRACKING SYSTEM

The system consists of two microcontrollers [7]: one for the sensors (IR receivers) and the other for generating the appropriate PWM for the servo motor. The neural network of the on stage performer tracking system [8] is shown in Figure 10 (a). In this network y1 and y2 are the outputs of the sensor controller, while z is the output of the PWM generating controller. In Figure 10 (a), w represents the weights, i.e. the connection strengths. As our system is logical, the weights take only logical values, 0 or 1. The weights of the neural network are shown in Figure 10 (b). From the neural network, the outputs of the sensor controller are:

    y1 = w11 x1 + w21 x2 + w31 x3                      (7)
    y2 = w12 x1 + w22 x2 + w32 x3                      (8)

We can write both equations together in the compact notation

    Y = W . X                                          (9)

which says that the output values are given by the matrix of connection strengths times the vector of input values. In matrix form:

    | y1 |   | w11  w21  w31 | | x1 |                  (10)
    | y2 | = | w12  w22  w32 | | x2 |
                               | x3 |

The inputs of the PWM generating controller are:

    u1 = y1
    u2 = y2

and its output is:

    z = z1 + z2 + z3
    z = (w5 u1 + w8 u2 + 1) + (w6 u1 + w9 u2) + (w7 u1 + w10 u2)

where z1 means "servo stop", z2 means "rotate servo anticlockwise" and z3 means "rotate servo clockwise". The possible input and output combinations of both controllers are shown in Table 1 and Table 2.
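The logical layer of equations (7)-(10) can be sketched in a few lines. The weight values below are our own choice, picked so that the outputs reproduce Table 1 (y1 = 1 for a right-sensor hit, y2 = 1 for a left-sensor hit, both 0 for a centre hit); the paper gives the weights only in Figure 10 (b).

```python
# Assumed weight matrix W for Y = W . X (equation (9)), with
# X = (x1, x2, x3) = (left, centre, right) sensor bits.
W = [[0, 0, 1],   # y1 row: responds to the right sensor (x3)
     [1, 0, 0]]   # y2 row: responds to the left sensor (x1)

def sensor_outputs(x):
    """Compute Y = W . X over logical (0/1) values, as in equation (9)."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

y_center = sensor_outputs((0, 1, 0))  # centre sensor hit -> y1 = y2 = 0
y_right = sensor_outputs((0, 0, 1))   # right sensor hit  -> y1 = 1
y_left = sensor_outputs((1, 0, 0))    # left sensor hit   -> y2 = 1
```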
VI. CONSTRUCTION AND WORKING

6.1 Construction
In the final design of the system we used two microcontroller boards and a servo drive circuit. These are shown in Figure 11.

Figure 10 (a) Neural network of the on stage performer tracking system (input layer: left sensor P3.3, centre sensor P3.5, right sensor P3.4; sensor controller outputs y1/y2 on P3.0/P3.1, feeding u1/u2 on P3.5/P3.4 of the PWM generating controller; output layer: PWM for the servo on P3.0)

One controller board is used to generate the continuous PWM for the servo according to the situation, while the other controller board is used to read the outputs of the sensors. The servo drive circuitry, as the name suggests, drives the servo by supplying the required voltages. The interconnections between the controller boards are shown in Figure 12.

Figure 10 (b) Weights of the neural network of the tracking system

Figure 11 System Design

TABLE 1: Input and output combinations of the sensor controller

    IR detected on sensor | P3.1 (y2) | P3.0 (y1)
    Center                |     0     |     0
    Right                 |     0     |     1
    Left                  |     1     |     0

TABLE 2: Input and output combinations of the PWM generating controller

    P3.5 (u1) | P3.4 (u2) | P3.0 (z)
        0     |     0     | Stop (z1)
        0     |     1     | Rotate anticlockwise (z2)
        1     |     0     | Rotate clockwise (z3)

The block diagram of the system is shown in Figure 13.
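Table 2 amounts to a small decision function from the controller inputs (u1, u2) to a servo action. The string labels and the "scan" fallback for the state not listed in Table 2 are illustrative stand-ins for the controller's actual PWM outputs.

```python
def pwm_command(u1, u2):
    """Map the PWM generating controller inputs (u1, u2) to a servo
    action per Table 2."""
    if (u1, u2) == (0, 0):
        return "stop"            # z1: centre sensor sees the beacon
    if (u1, u2) == (0, 1):
        return "anticlockwise"   # z2
    if (u1, u2) == (1, 0):
        return "clockwise"       # z3
    return "scan"                # state not in Table 2: assumed scanning mode
```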

Figure 12 Interconnection between boards

Figure 13 Block diagram of the system (IR receiver -> sensor controller -> PWM generating controller -> servo -> camera)

6.2 Working
The working of the system is quite simple. The performer carries an IR beacon and the system tracks the performer using three IR sensors. A demonstration of the project is shown in Figure 14, and the schematic of the IR beacon is shown in Figure 15.

Figure 14 Demonstration of the system
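The tracking behaviour of Section 6.2 can be simulated with a toy model of the three polarized sensors. The angular widths, step size and sign conventions below are our own simplifying assumptions (the paper specifies turn directions via the wiring, not angles); the point is only that the servo converges on the beacon once a side sensor fires and stops when the centre sensor fires.

```python
def read_sensors(servo_angle, performer_angle, narrow=2, wide=15):
    """Toy model of the three sensors: the centre (precision) sensor sees
    the beacon within +/-narrow degrees of the servo heading; each
    wide-range sensor sees it within wide degrees on its own side."""
    diff = performer_angle - servo_angle
    center = abs(diff) <= narrow
    right = narrow < diff <= wide
    left = -wide <= diff < -narrow
    return left, center, right

def track(servo_angle, performer_angle, steps=100):
    """Turn toward whichever side sensor fires; stop on a centre hit.
    With no hit at all, keep sweeping (assumed scanning direction)."""
    for _ in range(steps):
        left, center, right = read_sensors(servo_angle, performer_angle)
        if center:
            break
        servo_angle += 1 if right else -1 if left else 1
    return servo_angle

final = track(60, 72)  # servo at 60 deg closes in on a performer at 72 deg
```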

Figure 15 Schematic of the IR beacon

VII. DISCUSSION AND CONCLUSION

In this project we achieved a very low-cost tracking system using IR sensors and a servo motor. Robot localization is a key problem in making truly autonomous robots: if a robot does not know where it is, it can be difficult to determine what to do next. Our system can be installed at a fixed location and modified slightly to help robots localize. We have built the system to track one performer on the stage at a time, carrying any sufficiently powerful IR transmitter. It can, however, be modified to track multiple performers carrying different IR beacons, one at a time: of the multiple performers, only the one carrying a specific beacon is tracked at any moment. The RC5 protocol can be used for that purpose.

APPENDIX (CODE ALGORITHM)

Begin:
    if (center sensor == 1)
        Stop servo;
    else if (right sensor == 1)
        Rotate clockwise;
    else
        Rotate anticlockwise;
    Jump Begin;

ACKNOWLEDGMENT

All thanks to Almighty ALLAH, who strengthened us to complete this project and blessed us with a fabulous network of people. We would like to thank our supervisor, Dr. Salman Afghani, for his patience, guidance and encouragement throughout the duration of this project, and we look forward to future correspondence with him. We would also like to thank the other students and teachers who helped with this project in any way. Special thanks to Sir Ijlal for his support and encouragement.

REFERENCES

[1] Systems and methods for tracking a single source (US Patent #4067015)
[2] Tracking system (US Patent #5504477)
[3] http://www.electricmotors.machinedesign.com/guiedits/content/bdeee4a/bdeee4a_1.aspx
[4] http://en.wikipedia.org/wiki/rotation_matrix

[5] http://www.ladyada.net/learn/sensors/ir.html
[6] U. A. Bakshi and A. V. Bakshi, Electromagnetic Theory (book)
[7] http://www.atmel.com/images/doc0368.pdf
[8] http://www.pirobot.org/blog/0005/

Authors

Salman Afghani was born in Pakistan in 1958. Professor; PhD in advanced man-machine systems; MPhil in industrial automation; MS in Mechanical Engineering. Research interests: man-machine systems; robotics; intelligent all-terrain vehicles and submersibles; zoological and botanical computers; non-intrusive botanical genetics using a computer-simulated time-acceleration technique.

Khalid Riaz was born in Rawalpindi, Pakistan in 1990. He received his BSEE in Electrical Engineering from the University of Engineering and Technology, Taxila, Pakistan.

Yasir Raza was born in Rawalpindi, Pakistan in 1989. He received his BSEE in Electrical Engineering from the University of Engineering and Technology, Taxila, Pakistan.

Zahid Bashir was born in Rawalpindi, Pakistan in 1988. He received his BSEE in Electrical Engineering from the University of Engineering and Technology, Taxila, Pakistan.

Usman Khokhar was born in Rawalpindi, Pakistan in 1988. He received his BSEE in Electrical Engineering from the University of Engineering and Technology, Taxila, Pakistan.