Fabrication of the Kinect remote-controlled cars and planning of the motion interaction courses


Procedia - Social and Behavioral Sciences 174 (2015) 3102-3107, INTE 2014
doi: 10.1016/j.sbspro.2015.01.1047

Ke-Yu Lee a, Jhih-Sian Jheng a, Shiang-Chih Chen a, Shi-Jer Lou b,*

a Associate Professor, Department of Computer Science and Information Engineering, Cheng Shiu University, Taiwan
b Professor, Graduate Institute of Technological and Vocational Education, National Pingtung University of Science and Technology, Taiwan
* Corresponding author. Tel.: +886-8-770-3202. E-mail address: lou@mail.npust.edu.tw

Abstract

This paper describes the fabrication of Kinect remote-controlled cars, built from a PC, a Kinect sensor, an interface control circuit, an embedded controller, and an actuating device, as well as the planning of motion interaction courses. The Kinect sensor first detects the body movements of the user and converts them into control commands. The PC then sends the commands to an Arduino control board via XBee wireless communication modules, and the interface circuit drives the motors to control movement and direction: forward, backward, left, and right. To develop the content of the Kinect motion interaction courses, this study reviewed the literature on curriculum content and interviewed experts to collect data on learning backgrounds, teaching content, and unit content. Based on these data, teaching units and outlines were developed as a reference for such curricula.

(c) 2015 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/). Peer-review under responsibility of the Sakarya University.

Keywords: Kinect, motion, interaction, embedded control, remote controlled cars

1. Introduction

After Microsoft launched the Kinect gesture sensor and its SDK, Kinect motion-sensing technology combined with the PC has brought new human-machine interfaces into daily life, in fields such as education, medical care, entertainment, sports, and demonstrations, among many other innovative applications. In most earlier applications, gestures captured by the Kinect operated virtual objects in software, such as the characters of body-controlled video games.

In recent years, applications in which Kinect gesture technology controls hardware have increased; the Kinect remote-controlled car discussed in this paper is one such application. Kinect remote-controlled cars are derived from commercial remote-controlled toy cars. An Arduino control board and interface control circuits are installed in the car body to control the rotation direction of the front and rear motors (Chang, Chen, & Huang, 2011), and XBee wireless communication modules are installed to receive control signals from the PC. Combining the PC, the Kinect gesture sensor, the XBee wireless communication modules, the Arduino control board, and the interface circuits, the skeleton information extracted by the Kinect sensor drives the movement and direction of the toy car: forward, backward, left, and right. In the future, many industries will need engineers who are familiar with gesture control technology. We therefore also plan the teaching content of motion interaction courses and provide a reference for the relevant curricula.

2. Implementation of the Kinect remote-controlled car system

The Kinect remote-controlled car uses the Kinect sensor to detect motion information and convert it into control commands. The PC sends the commands to the Arduino control board via the XBee wireless communication modules, and the interface circuit then drives the motors for direction control. The hardware architecture of the Kinect remote-controlled car, shown in Figure 1, comprises the PC, the Kinect sensor, the XBee Explorer, the XBee wireless communication modules, the Arduino control board, the interface circuit, and the actuating device (motors). Each component is described below.

Fig. 1. System architecture of the Kinect remote-controlled car

2.1. Kinect sensor

As shown in Figure 2, the Kinect sensor carries three lenses. In the middle is a common RGB color camera, which can be used to recognize the identity or facial expressions of users and can also be applied to augmented reality games and video calls. The left and right lenses form a 3D depth sensor consisting of an infrared emitter and an infrared CMOS camera (Filipe, Fernandes, Fernandes, Sousa, & Paredes, 2012). The Kinect mainly uses this 3D depth sensor to detect user motion, and it has a tracking function: the motorized base can rotate to follow the tracked user (Dutta, 2012). In addition, the Kinect has a built-in array of four microphones, which supports noise suppression by comparing the audio signals received by the microphones.

Fig. 2. The Kinect sensor

2.2. Arduino control board

Arduino is an open-source microcontroller board. Its low price, easy-to-use software development tools, and rich online resources attract many engineers and interaction designers to build a wide range of novel interactive devices. An Arduino UNO board is installed in the Kinect remote-controlled car (Nghiem, Auvinet, & Meunier, 2012). Its core is an 8-bit ATmega328 microcontroller, which provides 14 digital I/O pins and 6 analog input pins and supports USB data transfer, so users can connect various electronic devices to the digital I/O pins.

2.3. XBee wireless communication module

XBee is a wireless communication module from Digi based on IEEE 802.15.4, with an operating voltage of 3 V. Before use, the X-CTU software is used to set the XBee module parameters (Raheja, Chaudhary, & Singal, 2011). Point-to-point data transmission is used between the PC and the Kinect remote-controlled car. The parameter settings are shown in Figure 3: the transmitting and receiving XBee modules are set to the same ID, and the baud rate is set to 9600 bps. The DOUT pin of the XBee module on the Kinect remote-controlled car is connected to the RX pin of the Arduino, and the DIN pin to the TX pin. After detecting the user's control gesture, the Kinect sensor program converts it into a control command. The XBee module on the PC side sends the control command to the XBee receiver module connected to the Arduino control board, and the Arduino finally drives the motors of the toy car according to the received command.
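Because an XBee pair in transparent mode behaves like a wireless serial cable, the PC-side transmitter reduces to ordinary serial port I/O. The following C# sketch illustrates this; it is a minimal illustration, not the authors' code, and the port name and single-character command codes are invented for illustration, while the 9600 bps setting matches the X-CTU configuration above.

using System;
using System.IO.Ports;

// PC-side command sender. Assumes the XBee pair runs in transparent
// (AT) mode, so every byte written to the local serial port is relayed
// verbatim to the XBee wired to the Arduino's RX pin.
class CommandSender : IDisposable
{
    private readonly SerialPort port;

    public CommandSender(string portName)
    {
        // 9600 bps, 8N1, matching the baud rate configured in X-CTU.
        port = new SerialPort(portName, 9600, Parity.None, 8, StopBits.One);
        port.Open();
    }

    public void Send(char command)
    {
        // Hypothetical codes: 'F' forward, 'B' backward,
        // 'L' turn left, 'R' turn right, 'S' stop.
        port.Write(new[] { command }, 0, 1);
    }

    public void Dispose() => port.Close();
}

A caller would then write, for example, new CommandSender("COM3").Send('F') to drive the car forward, with "COM3" replaced by whatever port the XBee Explorer enumerates as.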

Fig. 3. Setting of XBee parameters using X-CTU

2.4. Interface circuit

The core of the interface circuit is an L293D driver, which provides two H-bridge circuits for controlling the rotation direction of the front and rear DC motors (Xia, Chen, & Aggarwal, 2011). The power supply of the MCU is separated from that of the motors to prevent unstable operation of the circuit.

2.5. Kinect motion control process

Through the NUI Library, a Kinect application program can set the Kinect sensor parameters and extract sensor information, including the color image stream, the depth image stream, and the audio stream (Frati & Prattichizzo, 2011), as shown in Figure 4.

Fig. 4. Interaction of the application program with the Kinect device via the NUI Library (the image, depth, and audio streams of the sensor array pass through the NUI Library to the application)

The control programs of the remote-controlled car are written in C#, an object-oriented high-level language launched by Microsoft on the .NET Framework (Yen, Suma, Newman, Rizzo, & Bolas, 2011). The control programs take the human skeleton joints tracked by the Kinect sensor and map them to the control commands that steer the remote-controlled car (Norman, Dale, & Bret, 2011). The control actions of the remote-controlled car are summarized in Table 1, and a minimal sketch of this mapping follows.
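The joint-to-command mapping of Table 1 can be sketched in a few lines of C#. The snippet below assumes Kinect for Windows SDK 1.x (the Microsoft.Kinect namespace); the joint-distance thresholds and single-character command codes are illustrative assumptions, not the authors' exact implementation, and the classified command could be fed to the serial sender sketched in section 2.3.

using System;
using System.Linq;
using Microsoft.Kinect; // Kinect for Windows SDK 1.x

class GestureController
{
    static void Main()
    {
        KinectSensor sensor = KinectSensor.KinectSensors
            .First(s => s.Status == KinectStatus.Connected);
        sensor.SkeletonStream.Enable();
        sensor.SkeletonFrameReady += OnSkeletonFrame;
        sensor.Start();
        Console.ReadLine(); // run until Enter is pressed
        sensor.Stop();
    }

    static void OnSkeletonFrame(object sender, SkeletonFrameReadyEventArgs e)
    {
        using (SkeletonFrame frame = e.OpenSkeletonFrame())
        {
            if (frame == null) return;
            var skeletons = new Skeleton[frame.SkeletonArrayLength];
            frame.CopySkeletonDataTo(skeletons);
            var body = skeletons.FirstOrDefault(
                s => s.TrackingState == SkeletonTrackingState.Tracked);
            if (body == null) return;
            // Print the command; a real controller would send it
            // over the serial port instead.
            Console.WriteLine(Classify(body));
        }
    }

    // Classifies the gestures of Table 1 from joint positions
    // (meters, Y up). Thresholds are illustrative guesses.
    static char Classify(Skeleton body)
    {
        SkeletonPoint head  = body.Joints[JointType.Head].Position;
        SkeletonPoint left  = body.Joints[JointType.HandLeft].Position;
        SkeletonPoint right = body.Joints[JointType.HandRight].Position;
        SkeletonPoint lsho  = body.Joints[JointType.ShoulderLeft].Position;
        SkeletonPoint rsho  = body.Joints[JointType.ShoulderRight].Position;

        if (left.Y > head.Y)  return 'F';            // lift left hand: forward
        if (right.Y > head.Y) return 'B';            // lift right hand: backward
        if (Math.Abs(right.Y - rsho.Y) < 0.15f &&
            Math.Abs(right.X - rsho.X) > 0.35f)
            return 'R';                              // right hand extended: turn right
        if (Math.Abs(left.Y - lsho.Y) < 0.15f &&
            Math.Abs(left.X - lsho.X) > 0.35f)
            return 'L';                              // left hand extended: turn left
        return 'S';                                  // hands down: stop
    }
}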

Table 1. The control actions of the remote-controlled car.

Gesture                           Action of the remote-controlled car
Lift left hand                    Forward
Lift right hand                   Backward
Extend right hand horizontally    Turn right
Extend left hand horizontally     Turn left
Put down both hands               Stop

3. Course planning

While interactive motion technologies are increasingly applied and the demand for skilled engineers grows, most universities and technical colleges lack curricula built on Kinect interactive motion technology. We therefore conducted an expert interview survey to collect data on learning backgrounds, textbook content, and curriculum units, and formulated the course units and syllabus from these data. The course is taught through experiments; its content divides into theory and experiment, as follows.

3.1. Theory: introduce the Kinect sensor, embedded microprocessors, and the basic architecture and principles of the C# language.

3.2. Experiment: the experiments include the following (a code sketch for the first item appears after this list):

- Control the tilt angle of the Kinect base: connect an application program to the Kinect sensor in a simple way, and change the sensor's tilt angle through the Kinect SDK standard library.
- Audio signal processing: identify and record audio source locations, and perform voice recognition and voice synthesis.
- Color image signal processing: learn color image stream processing and its applications.
- Depth stream processing: understand the format and applications of the depth stream.
- Apply the skeleton tracking function: obtain human skeleton coordinate information and send it to application programs for advanced applications.
- Serial communication between the PC and embedded controllers: the PC connects to the embedded controller through a USB port; after conversion, two I/O pins serve as the transmit (Tx) and receive (Rx) pins to achieve serial communication.
- Switch a light ON/OFF: combine the Kinect voice recognition function with the embedded controller to control a light switch.
- Control LED displays via gestures: the Kinect sensor detects the positions of the user's arms, calculates their spatial coordinates, and controls several LED displays accordingly.
- Control motor rotation and speed via hand gestures: different gestures sensed by the Kinect provide skeleton information that controls the rotation direction and speed of motors.
- Remotely control a toy car via gestures: different gestures sensed by the Kinect steer the direction of the remote-controlled toy car.
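For the tilt-angle exercise, changing the angle through the Kinect SDK comes down to a single property. The sketch below again assumes Kinect for Windows SDK 1.x; the default angle of 10 degrees is an arbitrary illustration.

using System;
using System.Linq;
using Microsoft.Kinect; // Kinect for Windows SDK 1.x

class TiltDemo
{
    static void Main(string[] args)
    {
        KinectSensor sensor = KinectSensor.KinectSensors
            .First(s => s.Status == KinectStatus.Connected);
        sensor.Start(); // the sensor must be running before tilting

        int requested = args.Length > 0 ? int.Parse(args[0]) : 10;
        // ElevationAngle drives the motorized base; values outside
        // [MinElevationAngle, MaxElevationAngle] throw, so clamp first.
        sensor.ElevationAngle = Math.Max(sensor.MinElevationAngle,
            Math.Min(sensor.MaxElevationAngle, requested));

        sensor.Stop();
    }
}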

4. Conclusion

Applications of Kinect motion-sensing technology have gradually penetrated daily life and show promising prospects. Through the interactive course design and the Kinect remote-controlled car, students can learn the structure and working principles of the Kinect sensor, PC control program design, signal transmission, interface circuit design, and embedded controller design, training their system integration abilities and producing professionals who are familiar with Kinect motion control.

References

Chang, Y. J., Chen, S. F., & Huang, J. D. (2011). A Kinect-based system for physical rehabilitation: A pilot study for young adults with motor disabilities. Research in Developmental Disabilities, 2566-2570.
Dutta, T. (2012). Evaluation of the Kinect sensor for 3-D kinematic measurement in the workplace. Applied Ergonomics, 43, 645-649.
Filipe, V., Fernandes, F., Fernandes, H., Sousa, A., & Paredes, H. (2012). Blind navigation support system based on Microsoft Kinect. Procedia Computer Science, 14, 94-101.
Frati, V., & Prattichizzo, D. (2011). Using Kinect for hand tracking and rendering in wearable haptics. IEEE World Haptics Conference (WHC), 317-321.
Nghiem, A. T., Auvinet, E., & Meunier, J. (2012). Head detection using Kinect camera and its application to fall detection. The 11th International Conference on Information Science, 164-169.
Norman, V., Dale, R., & Bret, S. (2011). Teaching natural user interaction using OpenNI and the Microsoft Kinect sensor. Proceedings of the 2011 Conference on Information Technology Education, 227-232.
Raheja, J. L., Chaudhary, A., & Singal, K. (2011). Tracking of fingertips and centres of palm using Kinect. Third International Conference on Computational Intelligence, Modelling & Simulation, 248-252.
Xia, L., Chen, C. C., & Aggarwal, J. K. (2011). Human detection using depth information by Kinect. IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 15-22.
Yen, C., Suma, E., Newman, B., Rizzo, A. S., & Bolas, M. (2011). Development and evaluation of a low cost game-based balance rehabilitation tool using the Microsoft Kinect sensor. Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 1831-1834.