Volume 118 No. 24 2018
ISSN: 1314-3395 (on-line version)
url: http://www.acadpubl.eu/hub/

GESTURE BASED HOME AUTOMATION SYSTEM USING SPARTAN 3A, ASIC

1 K. Madhava Rao, 2 Battula Hariteja, 3 Patlolla Shruthi, 4 Roopa Nomula, 5 Preeti Katasani
1 Assistant Professor, 2-5 Students
Department of Electronics and Communication Engineering
B V Raju Institute of Technology
April 28, 2018

Abstract

Applications of robotics are increasing day by day. A gesture-controlled robot is a robot that can be controlled by simple gestures. The user just needs to give a signal to the gesture board, and the gesture board sends the information to a Spartan 3A FPGA. The robot moves according to the user-defined signals given to the Spartan 3A. The wired communication enables the user to interact with the robot in a more friendly way.

Key Words: Gesture recognition, Spartan, FPGA programmer, JTAG, USB cable, FPGA, Xilinx, Synopsys

1 Introduction

Robotics is one of the fastest-growing fields of this generation [2]. Implementations of such robots will be very useful in the future. Even though robots may replace human beings in some areas, a human being is still needed to control a robot, so there is ample scope for jobs in the robotics field. A robot of this kind is driven by a controlling device, which can be wired or wireless. Every system has its pros and cons, and the robotics field is no exception. In recent years, gesture control has become very popular in robotics: gestures are a more natural way to control a robot, and interaction between gestures and a robotic system has become simple and easy. Many wireless and wired systems have been developed for different applications in communication, medicine, the navy, and so on.

The objective of this paper is to build a wired gesture-controlled robot using a gesture board, a Spartan 3A FPGA, a motor driver and motors. The gesture board reads the input signal from the user: it generates an electric field up to a height of 10 cm above its surface. When the user makes a directional gesture above the gesture board, the E-field is disturbed in that direction and a corresponding signal is induced, so the sensed signal follows the user's movement. The GestIC MGC3130 IC processes this signal, converting the analog E-field signal into a two-bit digital signal. This digital signal is the input to the FPGA [5], which controls the motor driver according to the user-defined code.

2 LITERATURE REVIEW

SwarnaPrabha and Sworaj Kumar have proposed an accelerometer-based gesture-controlled robot [2], in which a transmitter transmits a signal according to the position of the accelerometer (the hand gesture) and a receiver receives the signal and makes the robot move in the corresponding direction. The main aim of building a gesture-controlled robot is to enable interaction between computers and human beings: the recognized gestures are used to control the robot, so that meaningful information can be conveyed. Man-machine interaction, also called human-computer interaction, refers to this relation between computer and human. Usability and functionality are the two main considerations in designing any such system.
Many reviews note that applications of gesture recognition systems are growing rapidly and have become important in day-to-day life, mainly in fields like robot control and games. This project advances such gesture systems by working through the different levels required to build the complete system.

3 PROPOSED SYSTEM

We have designed the gesture-controlled robot using an FPGA [5]. The transmitted signals are provided by the MGC3130 IC, which generates the electric field as an analog signal; at the other end there are receiving electrodes, which process the data in digital form and respond to the hand movements of the user.

Fig 1: Block Diagram of gesture control robot

Fig 2: Gesture Control Robot hardware

Figure 2 shows the Spartan 3A, the gesture board and the robot: the complete hardware and external view of the project.

B. Interface between Spartan 3A and Motor driver

The PWM program is executed by the Spartan FPGA board, and the PWM output is given to the L293D enable pin. This activates the L293D high-current quadruple half-H driver chip, which controls the motor speed. The truth table
for the different L293 movement operations (backward, forward, right and left) is shown in Table 1 [1].

Table 1: L293 input combinations to activate each movement

C. Flowchart of the project

Fig 3: Flow Chart

Figure 3 explains the overall flow of the project. First, gesture recognition [3] is performed by the board, which takes its input from the user's hand movement. This E-field signal is converted into a 2-bit digital form and given to the Spartan FPGA board as input. If the gesture direction does not correspond to a robot movement, the FPGA waits for the next gesture input; if it does, the robot changes its movement according to the gestured direction.

4 RESULTS

A. XILINX IMPLEMENTATION [6]
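Before examining the simulation waveforms, the gesture-to-motor mapping that the Verilog design implements can be sketched as a small Python reference model. The paper does not list the exact 2-bit gesture codes or the L293D pin assignment, so the encoding, the pin ordering and the duty-cycle value below are illustrative assumptions only:

```python
# Reference model of the gesture-to-motor mapping (a sketch; the actual
# 2-bit codes and L293D wiring in the design are not given in the paper).

# Assumed 2-bit gesture codes from the MGC3130 front end.
DIRECTIONS = {
    0b00: "forward",
    0b01: "backward",
    0b10: "left",
    0b11: "right",
}

# Assumed (left IN1, left IN2, right IN1, right IN2) levels for a
# two-motor differential drive through the L293D half-H bridges.
MOTOR_INPUTS = {
    "forward":  (1, 0, 1, 0),
    "backward": (0, 1, 0, 1),
    "left":     (0, 1, 1, 0),   # left motor reversed, right motor forward
    "right":    (1, 0, 0, 1),   # left motor forward, right motor reversed
}

def drive(gesture_code, enable_duty=0.75):
    """Map a 2-bit gesture code to L293D input pins plus the PWM duty
    cycle applied to the enable pin (which sets the motor speed)."""
    direction = DIRECTIONS[gesture_code & 0b11]
    return direction, MOTOR_INPUTS[direction], enable_duty

direction, pins, duty = drive(0b01)
print(direction, pins, duty)   # backward (0, 1, 0, 1) 0.75
```

A model like this can serve as the expected-value reference when checking the Xilinx waveforms in Figure 4 against the hardware behaviour.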
Fig 4: Gesture control robot waveforms in Xilinx [6]

Figure 4 shows the different inputs and outputs: according to the given direction, the robot has to move. This waveform justifies the output before the design is implemented on the hardware.

Fig 5: RTL schematic [6] of Gesture control robot

The simulation result is converted into gate level using the RTL schematic [6].
Fig 6: Total area used by the gesture robot

Slice Logic Utilization:
  Number of Slice LUTs:                 2 out of 46,560  (0%)
    Number used as logic:               2 out of 46,560  (0%)
Slice Logic Distribution:
  Number of LUT Flip Flop pairs used:   2
    Number with an unused Flip Flop:    2 out of 2       (100%)
    Number with an unused LUT:          0 out of 2       (0%)
    Number of fully used LUT-FF pairs:  0 out of 2       (0%)
  Number of unique control sets:        1
IO Utilization:
  Number of IOs:                        7
  Number of bonded IOBs:                7 out of 240     (2%)
    IOB Flip Flops/Latches:             4
Specific Feature Utilization:
  Number of BUFG/BUFGCTRLs:             1 out of 32      (3%)

B. ASIC IMPLEMENTATION

Fig 7: Gesture control robot waveform using the VCS compiler
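As a quick cross-check, the percentages in the Xilinx utilization report above follow from integer division of used against available resources (the tool appears to truncate to whole percent); a minimal Python check, with the device totals taken from the report itself:

```python
# Cross-check of the percentages in the device-utilization report.
# Totals (46,560 LUTs, 240 bonded IOBs, 32 BUFGs) come from the report;
# percentages are truncated to whole numbers, matching the tool's output.

def utilization(used, available):
    return used * 100 // available   # floor (truncated) percentage

print(utilization(2, 46560))   # LUTs:  0
print(utilization(7, 240))     # IOBs:  2
print(utilization(1, 32))      # BUFGs: 3
```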
Fig 8: RTL schematic of gesture control robot using Design Compiler

AREA REPORT

Fig 9: Physical Design

Number of ports:                8
Number of nets:                 9
Number of cells:                6
Number of combinational cells:  2
Number of sequential cells:     4
Number of macros/black boxes:   0
Number of buf/inv:              2
Number of references:           2

Combinational area:        11.059200
Buf/Inv area:              11.059200
Noncombinational area:     99.532799
Macro/Black Box area:       0.000000
Net Interconnect area:     12.615360
Total cell area:          110.591999
Total area:               123.207359

SUMMARY

Simple Verilog coding describes the design of the gesture-controlled robot. The design maps the gesture input to the Spartan 3A FPGA, which controls the motion of the robot, and it was successfully configured on the Spartan 3A FPGA. ASIC and FPGA are the two implementation methods used in this project. The design is divided into four phases, all of which were implemented in both the Xilinx and Synopsys tools, and we obtained accurate results with the Synopsys tool.

POWER REPORT

Global Operating Voltage = 1.2
Power-specific unit information:
  Voltage Units       = 1 V
  Capacitance Units   = 1.000000 ff
  Time Units          = 1 ns
  Dynamic Power Units = 1 uW
  Leakage Power Units = 1 pW

Cell Internal Power  =   4.5777 nW   (91%)
Net Switching Power  = 471.8869 pW   (9%)
Total Dynamic Power  =   5.0496 nW   (100%)
Cell Leakage Power   =   1.7807 uW

Group          Internal Power  Switching Power  Leakage Power  Total Power     (%)
sequential     3.8989e-03      6.9867e-07       1.6832e+06     1.6871          (94.48)
combinational  6.7879e-04      4.7119e-04       9.7474e+04     9.8624e-02      (5.52)
Total          4.5777e-03 uW   4.7189e-04 uW    1.7807e+06 pW  1.7857 uW

5 CONCLUSION

A fast and easy algorithm for controlling a robot with hand gestures was proposed. We acquired real images to make the algorithm efficient. Only a limited number of gestures are considered in our system; to recognize a broader set of gestures, it can be extended in many ways, for example to tune a music player or to change television channels. By interfacing with a PC, more applications can be controlled with a single gesture board, such as media players, game controllers and social networks. The size of the gesture board will be minimized
as a compact and portable unit. This gives rise to touchless gesture-controlled smartphones. For handicapped people [4], a wheelchair can be designed that moves in response to gesture signals.

References

[1] Ms. Shilpa Kale, Mr. S. S. Shriramwar, "FPGA-based Controller for a Mobile Robot," International Journal of Computer Science and Information Security (IJCSIS), Vol. 3, No. 1, 2009.
[2] SwarnaPrabha Jena, Sworaj Kumar Nayak, "Accelerometer based gesture controlled robot using Arduino," International Journal of Engineering Sciences & Research Technology, ISSN: 2277-9655, 4(4), April 2015.
[3] Anurag Mishra, Pooja Makula, Akshay Kumar, Krit Karan, V. K. Mittal, "Multi-modal controls of a smart robot," India Conference (INDICON), 2015 Annual IEEE, pp. 1-6, 2015, ISSN 2325-9418.
[4] Miguel Rivera-Acosta, Susana Ortega-Cisneros, Jorge Rivera, Federico Sandoval-Ibarra, "American Sign Language Alphabet Recognition Using a Neuromorphic Sensor and an Artificial Neural Network," Sensors, vol. 17, p. 2176, 2017, ISSN 1424-8220.
[5] Syed Tahir Hussain Rizvi, Gianpiero Cabodi, Denis Patti, Muhammad Majid Gulzar, "Comparison of GPGPU based robotic manipulator with other embedded controllers," Development and Application Systems (DAS), 2016 International Conference on, pp. 10-15, 2016.
[6] N. Naveenkumar, "Implementation of gesture recognition system for home automation using FPGA and ARM controller," International Journal of Science and Research (IJSR), ISSN (Online): 2319-7064.